-
Emotion effects on cognition have often been reported. However, only a few studies have investigated emotional effects on subsequent language processing, and in most cases these effects were induced by non-linguistic stimuli such as films, faces, or pictures. Here, we investigated, by means of event-related brain potentials (ERPs), how a paragraph of positive, negative, or neutral emotional valence affects the processing of a subsequent emotionally neutral sentence containing either a semantic violation, a syntactic violation, or no violation. Behavioral data revealed strong effects of emotion; error rates and reaction times increased significantly in sentences preceded by a positive paragraph relative to negative and neutral ones. In ERPs, the N400 to semantic violations was not affected by emotion. In the syntactic experiment, however, clear emotion effects were observed on ERPs. The left anterior negativity (LAN) to syntactic violations, which was not visible in the neutral condition, was present in the negative and positive conditions. This is interpreted as reflecting modulatory effects of prior emotions on syntactic processing and is discussed in the light of three alternative or complementary explanations based on emotion-induced cognitive styles, working memory, and arousal models. The present effects of emotion on the LAN are especially remarkable considering that syntactic processing has often been regarded as encapsulated and autonomous.
-
We studied the case of transparent word labels (e.g., 'push') placed on glass doors, which are seen from the other side as mirror-reversed script and thus require an action opposite to the word's meaning. Compared with a regular view, labels seen 'from the other side' in the glass-door situation caused strong delays of actions and a tripling of error rates. This problem is unrelated to mirror reading but is at least partially due to the need to act opposite to word meaning. The glass-door effect was unrelated to practice and age and showed no adaptation after incompatible trials. Distribution analyses showed an increased correct reaction time (RT) effect for slower responses, whereas accuracy effects were specific to fast responses. In the literature, problems with such mixed mappings have often been interpreted in terms of competing action tendencies. Experiments 1 to 4, however, demonstrated that this might merely be a task difficulty effect due to the necessity of a mental transformation in the case of mirror-reversed labels. Moreover, our results strongly advise against using transparent labels, because they may pose a considerable risk.
-
In so-called garden-path jokes, an initial semantic representation is violated, and semantic revision re-establishes a coherent representation. Forty-eight jokes were manipulated in three conditions: (i) a coherent ending, (ii) a joke ending, and (iii) a discourse-incoherent ending. A reading time study (N = 24) and three studies with recordings of ERPs and pupil changes (N = 21, 24, and 24, respectively) supported the hypothesized cognitive processes. Jokes showed increased reading times of the final word compared to coherent endings. ERP data mainly indicated semantic integration difficulties (N400). Larger pupil diameters to joke endings presumably reflect emotional responses. ERP evidence for increased discourse processing efforts and emotional responses, assumed to be reflected in modulations of the late left anterior negativity (LLAN) and in an enhanced late frontal positivity (fP600), respectively, remains, however, incomplete. Processing of incoherent endings was also accompanied by increased reading times, a stronger and sustained N400, and context-sensitive P600 effects. Together, these findings provide evidence for sequential, non-monotonic, and incremental discourse comprehension of garden-path jokes.
-
For emotional pictures with fear-, disgust-, or sex-related contents, stimulus size has been shown to increase emotion effects in attention-related event-related potentials (ERPs), presumably reflecting the enhanced biological impact of larger emotion-inducing pictures. If this is true, size should not enhance emotion effects for written words, whose meaning is symbolic and acquired. Here, we investigated ERP effects of font size for emotional and neutral words. While P1 and N1 amplitudes were not affected by emotion, emotion effects in the early posterior negativity started earlier and lasted longer for large relative to small words. These results suggest that emotion-driven facilitation of attention is not necessarily based on biological relevance, but might generalize to stimuli with arbitrary perceptual features. This finding points to the high relevance of written language in today's society as an important source of emotional meaning.
-
Facial attractiveness is of high importance for human interaction and communication, and everyday experience suggests that the mere sight of a face elicits a spontaneous appraisal of attractiveness. However, little is known about the time course of the brain responses related to this process. In the present study, event-related brain potentials were recorded during attractiveness classification of facial portraits that were standardized with respect to facial expression. The faces were preceded either by another face of high or low attractiveness or by an affectively neutral object. Attractive as opposed to non-attractive target faces elicited an early posterior negativity (EPN; ∼250 ms) and a late parietal positivity (LPC; 400-600 ms), neither of which was modulated by affectively congruent prime faces. Elevated LPC activity had previously been shown in response to attractive versus non-attractive faces, possibly reflecting task-related evaluative processes. An enhanced EPN had been reported for faces with emotional compared to neutral expressions and related to facilitated selection of emotional information. Extending these findings, our study is the first to report an attractiveness-related ERP modulation prior to the LPC, suggesting that the appraisal of facial attractiveness starts already at processing stages associated with stimulus selection.
-
We used facial EMG to examine reactions to the attractiveness of natural (faces) and artificial (abstract patterns) stimuli under long and short presentation durations. Attractive stimuli produced strong activations of the M. zygomaticus major muscle, indicating positive affective reactions, and unattractive stimuli produced strong activations of the M. corrugator supercilii muscle, indicating negative affective reactions. Fluency effects, indicated by stronger activations of the M. zygomaticus major under the longer presentation duration, were, however, only found for the abstract patterns. Moreover, the abstract patterns were also associated with more consistent activations over time than the faces, suggesting differences in the processes underlying the evaluation of faces and patterns. We discuss these results in terms of differences in appraisal processes between the two classes of stimuli: the greater biological, social, and sociosexual significance of faces triggers more complex appraisals than the abstract patterns.
-
Emotion effects in event-related brain potentials (ERPs) have previously been reported for a range of visual stimuli, including emotional words, pictures, and facial expressions. Still, little is known about the actual comparability of emotion effects across these stimulus classes. The present study aimed to fill this gap by investigating emotion effects in response to words, pictures, and facial expressions using a blocked within-subject design. Furthermore, ratings of stimulus arousal and valence were collected from an independent sample of participants. Modulations of the early posterior negativity (EPN) and the late positive complex (LPC) were visible for all stimulus domains, but showed clear differences, particularly in valence processing. While emotion effects were limited to positive stimuli for words, they were predominant for negative stimuli in pictures and facial expressions. These findings corroborate the notion of a positivity offset for words and a negativity bias for pictures and facial expressions, which has been assumed to result from generally lower arousal levels of written language. Interestingly, however, these assumed differences were not confirmed by arousal ratings. Instead, words were rated as overall more positive than pictures and facial expressions. Taken together, the present results point toward systematic differences in the processing of written words and pictorial stimuli of emotional content, not only in terms of a valence bias evident in ERPs, but also concerning their emotional evaluation as captured by ratings of stimulus valence and arousal.
-
Recent research suggests that emotion effects in word processing resemble those in other stimulus domains such as pictures or faces. The present study aims to provide more direct evidence for this notion by comparing emotion effects in word and face processing in a within-subject design. Event-related brain potentials (ERPs) were recorded as participants made decisions on the lexicality of emotionally positive, negative, and neutral German verbs or pseudowords, and on the integrity of intact happy, angry, and neutral faces or slightly distorted faces. Relative to neutral and negative stimuli, both positive verbs and happy faces elicited posterior ERP negativities that were indistinguishable in scalp distribution and resembled the early posterior negativities reported by others. Importantly, these ERP modulations appeared at very different latencies. Therefore, it appears that similar brain systems reflect the decoding of both biological and symbolic emotional signals of positive valence.
-
On the basis of current emotion theories, and of functional and neurophysiological ties between the processing of conflicts and errors on the one hand and of errors and emotions on the other, we predicted that conflicts between prepotent Go responses and occasional NoGo trials in the Go/NoGo task would induce emotions. Skin conductance responses (SCRs), corrugator muscle activity, and startle blink responses were measured in three experiments requiring speeded Go responses intermixed with NoGo trials of different relative probability, and in a choice reaction experiment serving as a control. NoGo trials affected several of these emotion-sensitive indicators: SCRs and startle blinks were reduced, whereas corrugator activity was prolonged, as compared to Go trials. From the pattern of findings we suggest that NoGo conflicts are not aversive. Instead, they appear to be appraised as obstructive to the response goal and as less action-relevant than Go trials.
-
It has been suggested that cognitive conflicts require effortful processing and are therefore aversive. In the present study, we compared conflicts emerging from the inhibition of a predominant response tendency in a go/no-go task with those between incompatible response activations in a Simon task in a within-subjects design, using the same type of stimuli. Whereas no-go trials elicited reduced skin conductance and pupillometric responses, but prolonged corrugator muscle activity, as compared with go trials, incompatible and compatible Simon trials were indistinguishable with respect to these parameters. Furthermore, the conflict-sensitive N2 components of the event-related brain potential were similar in amplitude but showed significantly different scalp distributions, indicating dissociable neural generator systems. The present findings suggest the involvement of different emotional and cognitive processes in the two types of cognitive conflict, neither of which, however, appears to be aversive. In addition, the N2 findings call into question claims of a common monitoring system for all kinds of cognitive conflicts.
-
Pupillary responses have been shown to be sensitive to both task load and emotional content. We investigated the interplay of these factors in the processing of single words that varied in emotional valence and arousal. Two tasks of different cognitive load, uninstructed reading and a lexical decision task, were employed, followed by an unannounced recognition task. Reaction times were faster and incidental memory performance was better for high-arousing than for low-arousing words. In contrast to previous findings for pictures and sounds, high-arousing words elicited smaller pupillary responses than low-arousing words; these effects were independent of task load, which increased pupil diameter. Therefore, emotional arousal attributed to words does not mandatorily activate the autonomic nervous system, but rather works on a cognitive level, facilitating word processing.
-
Using a questionnaire, 98 music experts were asked to report on their affective, cognitive, and physiological reactions to a piece of music they had recently heard and that struck them as having produced an emotional response. In addition, participants were asked to rate the relative importance of a list of musical and extramusical features that could have contributed to their reactions. A coding system was developed to organize and quantify the freely reported reactions. With respect to bodily symptoms, the most frequent reactions included semi-physiological variables such as tears and shivers, cardiovascular symptoms, and incitement to motor action such as jumping or dancing. With respect to subjective experiences or feelings, reports such as feeling nostalgic, charmed, moved, or aroused were more frequent than reports of 'basic' emotions such as sadness, anger, joy, or fear. Musical structure was given the highest rating of the list of potential determinants.
-
We investigated whether face-specific processes, as indicated by the N170 in event-related brain potentials (ERPs), are modulated by the emotional significance of facial expressions. Emotional modulations over the temporo-occipital electrodes typically used to measure the N170 were less pronounced when ERPs were referenced to mastoids than when an average reference was applied. This offers a potential explanation as to why the literature has so far yielded conflicting evidence regarding effects of emotional facial expressions on the N170. However, the spatial distributions of the N170 and of the emotion effects across the scalp were distinguishable at the same time point, suggesting different neural sources for the N170 and emotion processing. We conclude that the N170 component itself is unaffected by emotional facial expressions, with overlapping activity from the emotion-sensitive early posterior negativity accounting for amplitude modulations over typical N170 electrodes.
-
Semantic knowledge is thought to be at least partially grounded in sensory, motor, and affective information, acquired through experiences in our inner and outer world. The reactivation of experience-related information during meaning access is called simulation. The affective simulation account assumes that the grounding information depends on a concept's concreteness: whereas abstract concepts are thought to be represented mainly through affective experiential information, concrete words rely more on sensory-motor experiential information. To test this hypothesis, we measured facial muscle activity as an indicator of affective simulation during visual word recognition. Words varied on the dimensions of concreteness and valence. Behavioral and electromyographic data were analyzed with linear mixed-effects models with maximal random effect structure to optimize generalization over participants and word samples. Contrary to the hypothesis, we found a valence effect in the M. corrugator supercilii only in response to concrete but not to abstract words. Our data show that affective simulation, as measured with facial muscle activity, occurs in response to concrete rather than abstract words. More concrete words are supposed to have higher context availability and richer visual imagery, which might promote affective simulation at the expressive level of facial muscle activity. The results are in line with embodied accounts of semantic representation but speak against a predominant role of affective simulation in representing affective information in abstract concepts.
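The critical contrast in this design, the corrugator valence effect computed separately for concrete and abstract words, can be illustrated with a minimal sketch. All trial values and names below are hypothetical, and the sketch reduces the reported linear mixed-effects analysis to simple cell means for illustration only.

```python
from statistics import mean

# Hypothetical trial records: (word_type, valence, baseline µV, word-window µV).
# The values are invented purely to show the shape of the computation.
trials = [
    ("concrete", "negative", 4.0, 5.1), ("concrete", "negative", 3.8, 4.9),
    ("concrete", "positive", 4.1, 3.6), ("concrete", "positive", 3.9, 3.5),
    ("abstract", "negative", 4.0, 4.1), ("abstract", "negative", 3.9, 4.0),
    ("abstract", "positive", 4.0, 3.9), ("abstract", "positive", 4.1, 4.0),
]

def corrugator_valence_effect(trials, word_type):
    """Baseline-corrected corrugator change, negative minus positive valence."""
    def changes(valence):
        return [word - base for wt, v, base, word in trials
                if wt == word_type and v == valence]
    return mean(changes("negative")) - mean(changes("positive"))
```

On data of this shape, the concrete-word contrast clearly exceeding the abstract-word contrast would mirror the reported pattern; the actual study estimated this interaction with crossed random effects for participants and items rather than raw cell means.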
-
Recent evidence suggests that dynamic facial expressions of emotion unfolding over time are better recognized than static images. However, the mechanisms underlying this facilitation are unclear. Here, participants performed expression categorizations for faces displaying happy, angry, or neutral emotions either in a static image or dynamically evolving within 150 ms. Performance replicated facilitation of emotion evaluation for happy expressions in dynamic over static displays. An initial emotion effect in event-related brain potentials evidenced in the early posterior negativity (EPN) was both enhanced and prolonged when participants evaluated dynamic in comparison to static facial expressions. Following the common interpretation of the EPN, this finding suggests that the facilitation for dynamic expressions is related to enhanced activation in visual areas starting as early as 200 ms after stimulus onset, presumably due to shifts of visual attention. Enhancement due to dynamic display was also found for the late positive complex (LPC), indicating a more elaborative processing of emotional expressions under this condition at subsequent stages.
-
For visual stimuli of emotional content such as pictures and written words, stimulus size has been shown to increase emotion effects in the early posterior negativity (EPN), a component of event-related potentials (ERPs) indexing attention allocation during visual sensory encoding. In the present study, we addressed the question of whether this enhanced relevance of larger (visual) stimuli might generalize to the auditory domain and whether auditory emotion effects are modulated by volume. Subjects listened to spoken words with emotional or neutral content, played at two different volume levels, while ERPs were recorded. Negative emotional content led to an increased frontal positivity and parieto-occipital negativity, a scalp distribution similar to the EPN, between 370 and 530 ms. Importantly, this emotion-related ERP component was not modulated by differences in volume level. Volume level did, as hypothesized, impact early auditory processing, as reflected in increased amplitudes of the N1 (80-130 ms) and P2 (130-265 ms) components; however, contrary to effects of stimulus size in the visual domain, it did not influence later ERP components. These findings indicate modality-specific and functionally independent processing of the emotional content of spoken words and of volume level.
-
Skin conductance responses (SCRs) to NoGo stimuli have been found to be smaller than those to Go stimuli, possibly due to their diminished task relevance. These findings were obtained at inter-stimulus intervals (ISIs) that were unusually short for SCR recordings. Therefore, we tested whether the same findings would also hold at longer ISIs. Simultaneously, effects of ISI duration on the NoGo-N2 and NoGo-P3 components of event-related brain potentials (ERPs) were assessed. Go and NoGo stimuli were equiprobable, while the ISI varied between 2, 5, and 8 s. Although increasing the ISI enhanced SCR amplitudes in general, it did not modulate the attenuation of the response to NoGo relative to Go stimuli. When considered as the difference between NoGo and Go conditions, neither the NoGo-N2 nor the NoGo-P3 was affected by the ISI variation. Together, these data confirm the feasibility of co-registering ERPs and SCRs.
-
We assessed the automaticity of emotional face processing with respect to the intentionality criterion, which holds that automatic processes are triggered independently of intention. For this purpose, we observed emotion processing in event-related brain potential (ERP) components under five different task conditions. ERP components included the P1, the N170, the early posterior negativity (EPN), and the late positive complex (LPC). Enhanced processing at perceptual stages, as indicated by P1, N170, and EPN effects, occurred independently of intention for angry expressions. In contrast, the emotion-related LPC, a putative manifestation of higher-level, more elaborative processing stages, depended on the intentional state of the participants. This suggests an automatic threat-related processing bias at perceptual stages, while higher cognitive emotion encoding is subject to voluntary control. Moreover, an independent component analysis (ICA) showed that EPN and LPC activity occurred simultaneously, indicating that perceptual and higher cognitive emotion encoding occur in parallel.
-
The late positive potential (LPP) elicited by affective stimuli in the event-related brain potential (ERP) is often assumed to be a member of the P3 family. The present study addresses the relationship of the LPP to the classic P3b in a published data set, using a non-parametric permutation test for topographical comparisons and residue iteration decomposition to assess the temporal features of the LPP and the P3b by decomposing the ERP into several component clusters according to their latency variability. The experiment orthogonally manipulated the arousal and valence of words, which were either read or judged for lexicality. High-arousing and positively valenced words induced a larger LPP than low-arousing and negatively valenced words, respectively, and the lexical decision task elicited a larger P3b than reading. The experimental manipulations of arousal, valence, and task yielded main effects without any interactions on ERP amplitude in the LPP/P3b time range. The arousal and valence effects partially differed from the task effect in scalp topography; in addition, whereas the late positive component elicited by affective stimuli, defined as LPP, was stimulus-locked, the late positive component elicited by task demand, defined as P3b, was mainly latency-variable. Therefore, LPP and P3b manifest different subcomponents.
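A non-parametric permutation test for topographical comparisons of the kind mentioned above can be sketched as follows: scale each participant's topography to a common range so that overall amplitude differences cannot masquerade as shape differences, take the maximal electrode-wise difference between condition grand averages as the test statistic, and build its null distribution by randomly swapping condition labels within participants. The electrode counts and data shapes below are invented; this is a sketch of the general technique, not the study's exact pipeline.

```python
import random
from statistics import mean

def normalize(topo):
    # Scale one participant's topography (list of electrode amplitudes) to [0, 1].
    lo, hi = min(topo), max(topo)
    return [(v - lo) / (hi - lo) for v in topo]

def topo_stat(cond_a, cond_b):
    # Maximal absolute electrode-wise difference between the grand-average
    # normalized topographies of the two conditions.
    ga_a = [mean(col) for col in zip(*(normalize(s) for s in cond_a))]
    ga_b = [mean(col) for col in zip(*(normalize(s) for s in cond_b))]
    return max(abs(a - b) for a, b in zip(ga_a, ga_b))

def permutation_test(cond_a, cond_b, n_perm=2000, seed=0):
    # cond_a / cond_b: one topography per participant, in matching order.
    rng = random.Random(seed)
    observed = topo_stat(cond_a, cond_b)
    hits = 0
    for _ in range(n_perm):
        pa, pb = [], []
        for sa, sb in zip(cond_a, cond_b):
            # Under the null hypothesis of identical topographies, condition
            # labels are exchangeable within each participant.
            if rng.random() < 0.5:
                sa, sb = sb, sa
            pa.append(sa)
            pb.append(sb)
        if topo_stat(pa, pb) >= observed:
            hits += 1
    return observed, (hits + 1) / (n_perm + 1)
```

For instance, participants showing a posterior-maximal topography in one condition but a frontal-maximal one in the other would yield a large observed statistic and a small p-value, licensing the inference of dissociable neural generators.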
-
Relatively little is known regarding the broad factor of correct decision speed (CDS), which is represented in the theory of fluid and crystallized intelligence. The current study (N = 186) examined the possibility that distinct CDS factors may exist that are specific to the broad ability assessed by the tasks from which the correct response latencies are derived, in this instance fluid and crystallized intelligence (Gf and Gc) tasks. Additionally, the relationships between the correct response latencies and Gf, Gc, and processing speed (Gs) were investigated. Two distinct yet correlated factors of CDS were identified for Gf and Gc tasks, respectively. Both CDS factors were related to their ability-factor counterparts, and the Gc-based CDS factor was weakly related to Gs. However, item difficulty moderated the relationships between CDS and the abilities: when item difficulty was considered relative to groups of participants differing in ability level, differences in the speed of responses were found.