Recently changed content

All recently changed content, in descending chronological order
  • Priming Interpretations: Contextual Impact on the Processing of Garden Path Jokes

    In garden path (GP) jokes, a first dominant interpretation is detected as incoherent and subsequently substituted by a hidden joke interpretation. Two important factors for the processing of GP jokes are the salience of the initial interpretation and the accessibility of the hidden interpretation. Both factors are assumed to be affected by contextual embedding. We investigated this contextual impact with a priming manipulation in two self-paced reading experiments (Ns = 45/44). Words that were semantically related to the initial interpretation, related to the hidden interpretation, or unrelated to both served as priming cues before the joke presentation. Only priming the semantic content of the hidden interpretation reduced reading times for the final word of the GP joke compared with both other conditions, indicating facilitated detection and revision of the otherwise incoherent discourse. The results highlight the impact of subtle contextual cues on the reinterpretation stage in GP joke comprehension.

  • The valence of food in pictures and on the plate: impacts on brain and body

    Food and food-related stimuli can be powerful elicitors of affect. Here we investigated how the valence of food pictures and the consumption of meals with different arrangements on the plate modulate physiological and subjective variables. In particular, we presented pictures of food stimuli rated as high or average in valence, before and after participants consumed strictly standardized meals. The meals consisted either of several separate constituents on the plate (complex meal) or of exactly the same constituents mixed together (simple meal). Food pictures of positive or neutral valence had to be discriminated from randomly intermixed pictures of faces showing either happy or neutral expressions. During the complex meal, blood glucose increased more slowly than during the simple meal, indicating a beneficial effect of the former that is worthy of further investigation, because more rapid changes in glucose level are considered to be related to more intense food craving. For the food stimuli, emotion effects were present both very early on and in the late positive complex. Hence, food pictures may elicit a rapid, reflex-like affective response followed by a later evaluative stage. Interestingly, the valence-related ERP responses to food pictures seemed to be relatively independent of motivational state (i.e., satiation), because the intervening meals did not modulate them.

  • Motivational salience modulates early visual cortex responses across task sets

    Motivationally relevant stimuli benefit from strengthened sensory processing. It is unclear, however, if motivational value of positive and negative valence has similar or dissociable effects on early visual processing. Moreover, whether these perceptual effects are task-specific, stimulus-specific, or more generally feature-based is unknown. In this study, we compared the effects of positive and negative motivational value on early sensory processing using ERPs. We tested the extent to which these effects could generalize to new task contexts and to stimuli sharing common features with the motivationally significant ones. At the behavioral level, stimuli paired with positive incentives were learned faster than stimuli paired with neutral or negative outcomes. The ERP results showed that monetary loss elicited higher neural activity in V1 (at the C1 level) compared with reward, whereas the latter influenced postperceptual processing stages (P300). Importantly, the early loss-related effect generalized to new contexts and to new stimuli with common features, whereas the later reward effects did not spill over to the new context. These results suggest that acquired negative motivational salience can influence early sensory processing by means of plastic changes in feature-based processing in V1.

  • Test-retest reliability of the N400 component in a sentence-reading paradigm

    The N400 component of the event-related potential is considered an index of semantic processing and may therefore be an ideal biomarker of semantic system disorders or individual differences. To this end, it is necessary to assess its test–retest reliability. Only one previous study has addressed this question, reporting good test–retest reliability (r = 0.85). However, that study used a word-pair priming paradigm, which differs in many respects from the more typical and ecologically valid sentence-reading paradigm. The present study assesses the test–retest reliability of the N400 in a sentence-reading paradigm. The best value obtained was r = 0.63, implying relatively poor test–retest reliability. Crucial factors for this result may be the long interval between context and critical word, as well as the more complex contexts in sentence-reading paradigms. These factors might make N400 effects in sentences more vulnerable to linguistic and non-linguistic factors that increase variance across sessions.

  • The impact of personal relevance on emotion processing: Evidence from event-related potentials and pupillary responses
  • Associated motivational salience impacts early sensory processing of human faces

    Facial expressions of emotion have an undeniable processing advantage over neutral faces, discernible both at the behavioral level and in emotion-related modulations of several event-related potentials (ERPs). Recently, it was proposed that inherently neutral stimuli might also gain salience through associative learning mechanisms. The present study investigated whether acquired motivational salience leads to processing advantages similar to those of biologically determined, inherent emotional salience by applying an associative learning paradigm to human face processing. Participants (N = 24) were trained to categorize neutral faces into salience categories by receiving different monetary outcomes. ERPs were recorded in a subsequent test phase consisting of gender decisions on previously associated faces, as well as on familiarized and novel faces expressing happy, angry, or no emotion. Previously reward-associated faces boosted the P1 component, indicating that acquired reward associations modulate early sensory processing in extrastriate visual cortex. However, ERP modulations to emotional (primarily angry) expressions extended to subsequent processing stages, as reflected in well-established emotion-related ERPs. The present study offers new evidence that motivational salience associated with inherently neutral stimuli can sharpen sensory encoding but does not necessarily lead to preferential processing at later stages.

  • Toward a comprehensive test battery for face cognition: Assessment of the tasks.

    Despite the importance of face recognition in everyday life and frequent complaints about its failure, there is no comprehensive test battery for this ability. As a first step in constructing such a battery, we present 18 tasks aimed at measuring face perception, face learning, face recognition, and the recognition of facially expressed emotions. A sample of 153 healthy young adults completed all tasks. In general, reaction time measures showed high estimates of internal consistency; tasks focused on performance accuracy yielded reliabilities that were somewhat lower, yet high enough to support their use in a battery of face cognition measures. Some of the tasks allowed computation of established experimental effects in a within-subjects design, such as the part-whole effect. Most of these experimental effects were confirmed in our large sample, and valuable effect size estimates were obtained. However, in many cases these difference measures showed poor estimates of internal consistency.

  • Time course and task dependence of emotion effects in word processing.

    The emotional content of stimuli influences cognitive performance. In two experiments, we investigated the time course and mechanisms of emotional influences on visual word processing in various tasks by recording event-related brain potentials (ERPs). The stimuli were verbs of positive, negative, and neutral valence. In Experiment 1, where lexical decisions had to be performed on single verbs, both positive and negative verbs were processed more quickly than neutral verbs and elicited a distinct ERP component, starting around 370 msec. In Experiment 2, the verbs were embedded in a semantic context provided by single nouns. Likewise, structural, lexical, and semantic decisions for positive verbs were accelerated, and an ERP effect with a scalp distribution comparable to that in Experiment 1 now started about 200 msec earlier. These effects may signal an automatic allocation of attentional resources to emotionally arousing words, since they were not modulated by different task demands.

  • The influence of emotional words on sentence processing: Electrophysiological and behavioral evidence.

    Whereas most previous studies on emotion in language have focused on single words, we investigated the influence of the emotional valence of a word on the syntactic and semantic processes unfolding during sentence comprehension, by means of event-related brain potentials (ERPs). Experiment 1 assessed how positive, negative, and neutral adjectives that could be either syntactically correct or incorrect (violation of number agreement) modulate syntax-sensitive ERP components. The amplitude of the left anterior negativity (LAN) to morphosyntactic violations was increased for negative and decreased for positive words in comparison to neutral words. In Experiment 2, the same sentences were presented, but the positive, negative, and neutral adjectives could be either semantically correct or anomalous given the sentence context. The N400 to semantic anomalies was not significantly affected by the valence of the violating word. However, positive words in a sentence appeared to influence semantic correctness decisions, triggering an apparent N400 reduction irrespective of the correctness of the word. Later linguistic processes, as reflected in the P600 component, were unaffected in either experiment. Overall, our results indicate that the emotional valence of a word impacts the syntactic and semantic processing of sentences, with differential effects as a function of valence and domain.

  • The coupling of emotion and cognition in the eye: Introducing the pupil old/new effect.

    The study presented here investigated the effects of emotional valence on memory for words by assessing both memory performance and pupillary responses during a recognition memory task. Participants had to make speeded judgments on whether a word presented in the test phase of the experiment had already been presented ('old') or not ('new'). An emotion-induced recognition bias was observed: Words with emotional content not only produced more hits but also elicited more false alarms than neutral words. Further, we found a distinct pupil old/new effect, characterized by an elevated pupillary response to hits as opposed to correct rejections. Interestingly, this pupil old/new effect was clearly diminished for emotional words. We therefore argue that the pupil old/new effect not only mirrors memory retrieval processes but also reflects modulation by an emotion-induced recognition bias.

  • Test battery for measuring the perception and recognition of facial expressions of emotion.

    Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, along with data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or the accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring abilities related to emotion perception and emotion recognition in normal populations.

  • Sex differences in face cognition.

    Although there is abundant evidence for female superiority in face cognition (FC), a number of questions regarding sex differences remain to be addressed. Here we report a reanalysis of data at the level of latent factors, modeled on the basis of an extensive test battery applied to three samples totaling over 800 adults. In independent samples, the measurement structure of FC was invariant across the sexes, indicating that measurement of the construct does not depend on the context variable sex and that investigating mean performance differences will not be biased by measurement issues, a neglected aspect in previous studies. We confirmed female superiority for face perception (FP) and face memory (FM). For the first time, we could show that these sex differences prevailed after accounting for sex differences in broadly measured general cognitive functioning and in object perception. Across adult age, sex differences in FM increased due to the rapid decline of this ability in men, whereas performance in women remained stable. Self-reported social involvement and things-oriented activities moderated sex differences in FM. The results show that sex differences are salient at the level of specific FC constructs and can be partially explained by social involvement.

  • Rules and heuristics during sentence comprehension: Evidence from a dual-task brain potential study.

    Whether syntactic and semantic processes during sentence comprehension follow strict sets of rules or succumb to context-dependent heuristics was studied by recording event-related potentials in a dual-task design. In one condition, sentence-extraneous acoustic material was either semantically congruent or incongruent relative to an adjective in the visually presented sentence, the latter being either semantically correct or incorrect within the sentence context. Homologous syntactic (gender) manipulations were performed in another condition. Syntactic processing within the sentence appeared to be blind to the syntactic content of the second task. In contrast, semantically incongruous material of the second task induced fluctuations typically associated with the detection of within-sentence semantic anomalies (N400) even in semantically correct sentences. Subtle but extant differences in topography between this N400 and that obtained with within-sentence semantic violations add to recent proposals of separate semantic subsystems differing in their specificity for sentence structure and computational procedures. Semantically incongruous material of the second task also influenced later stages of the processing of semantically incorrect adjectives (P600 component), which are traditionally assumed to pertain to the syntactic domain. This result is discussed in the light of current proposals of a third combinatorial stream in sentence comprehension.

  • Recognizing dynamic facial expressions of emotion: Specificity and intensity effects in event-related brain potentials.

    Emotional facial expressions usually arise dynamically from a neutral expression. Yet most previous research has focused on static images. The present study investigated basic aspects of the processing of dynamic facial expressions. In two experiments, we presented short videos of facial expressions of six basic emotions and of non-emotional facial movements, emerging at variable or fixed rise times and attaining different intensity levels. In event-related brain potentials (ERPs), effects of emotion, but also of non-emotional movements, appeared as an early posterior negativity (EPN) between 200 and 350 ms, suggesting an overall facilitation of early visual encoding for all facial movements. These EPN effects were emotion-unspecific. In contrast, relative to happy and neutral expressions, negative emotional expressions elicited larger late positive ERP components (LPCs), indicating more elaborate processing. Both EPN and LPC amplitudes increased with expression intensity. Effects of emotion and intensity were additive, indicating that intensity (understood as the degree of motion) increases the impact of emotional expressions but not their quality. These processes can be driven by all basic emotions, and there is little emotion-specificity even when statistical power is considerable (N = 102 in Experiment 2).

  • Reading emotional words within sentences: The impact of arousal and valence on event-related potentials.

    Effects of emotional word meaning have been studied exclusively for words in isolation, but not in the context of sentences. We addressed this question within the framework of two-dimensional models of affect, conceiving emotion as a function of valence and arousal. Negative and neutral target verbs, embedded within sentences, were presented while event-related brain potentials (ERPs) and the activity of the corrugator muscle were recorded. Twenty-one participants performed a semantic decision task on the target verbs. In contrast to single-word studies, no early posterior negativity was present. However, emotion effects in ERPs were evident in a late positive complex (LPC) for negative, high-arousal words in comparison to neutral words. Interestingly, the LPC was unaffected by pure arousal variation when valence was controlled for, indicating the importance of valence for this emotion-related ERP effect.

  • Perceiving and remembering emotional facial expressions—A basic facet of emotional intelligence.

    Decoding the meaning of facial expressions is a major pathway of human communication and has been extensively studied as a basic facet of emotional intelligence. To better understand the structure and specificity of the abilities subsumed under emotion decoding from faces (facial emotion perception and facial emotion recognition), the multivariate measurement of individual differences is essential. In the present study, we focused on the abilities to perceive and recognize facial expressions of emotion and investigated their internal structure and nomological net. N = 269 participants with a heterogeneous educational background completed a large test battery including multiple assessment paradigms grounded in basic experimental research. The results allowed us to establish task-general measurement models of facial emotion perception (EP) and recognition (ER). In these measurement models, emotion category-related specificity was negligible. The most important conclusion from the present study is the strongly limited specific variance in perceptual performance for particular emotion-related facial expressions, and in emotion decoding from faces in general, relative to face identity processing and fluid cognitive abilities (figural reasoning, working memory, and immediate and delayed memory). We discuss implications of the present results for building the nomological net of emotional intelligence and outline desiderata for future research.

  • P1 and beyond: Functional separation of multiple emotion effects in word recognition.

    Event-related potentials (ERPs) have revealed effects of emotional meaning on word recognition at distinguishable processing stages, in rare cases even in the P1 time range. However, the boundary conditions of these effects, such as the roles of different levels of linguistic processing or the relative contributions of the emotional valence and arousal dimensions, remain to be fully understood. The present study addresses this issue by employing two tasks with different processing demands on words that varied orthogonally in emotional valence and arousal. Effects of emotional valence in ERPs were evident from 100 ms after word onset and showed a task-insensitive processing advantage for positive words. Early posterior negativity (EPN) effects for high-arousing words were limited to the lexical decision task, corroborating recent reports suggesting that perceptual processing as reflected in the EPN might not be as automatic as previously assumed.

  • Overcoming limitations of the ERP method with residue iteration decomposition (RIDE): A demonstration in go/no-go experiments.

    The usefulness of the event-related potential (ERP) method can be compromised by violations of its underlying assumptions, for example, by confounded variations of the latency and amplitude of ERP components within and between conditions. Here we show how the ERP subtraction method can yield misleading information due to latency variability of ERP components. We propose a solution to this problem by correcting for latency variability using residue iteration decomposition (RIDE), demonstrated with data from representative go/no-go experiments. The overlap of N2 and P3 components in go/no-go data gives rise to spurious topographical localization of the no-go N2 component. RIDE decomposes N2 and P3 based on their latency variability. The decomposition restored the N2 topography by removing the contamination from latency-variable late components. The RIDE-derived N2 and P3 give clearer insight into their functional relevance in the go/no-go paradigm.

  • On the automaticity of emotion processing in words and faces: Event-related brain potential evidence from a superficial task.

    The degree to which emotional aspects of stimuli are processed automatically is controversial. Here, we assessed the automatic elicitation of emotion-related effects in event-related brain potentials (ERPs) to positive, negative, and neutral words and facial expressions in an easy and superficial face-word discrimination task, for which emotional valence was irrelevant. Both emotional words and facial expressions impacted ERPs as early as 50 to 100 ms after stimulus onset, possibly reflecting rapid relevance detection. Beyond this initial processing stage, only emotionality in faces, but not in words, was associated with an early posterior negativity (EPN). Therefore, when emotion is irrelevant to a task requiring only superficial stimulus analysis, automatically enhanced sensory encoding of emotional content appears to occur only for evolutionarily prepared emotional stimuli, as reflected in larger EPN amplitudes for faces but not for symbolic word stimuli.

  • Measuring the speed of recognising facially expressed emotions.

    Faces provide identity- and emotion-related information: basic cues for mastering social interactions. Traditional models of face recognition suggest that after a first common stage, the processing streams for facial identity and expression diverge. In the present study, we extended our previous multivariate investigations of face identity processing abilities to the speed of recognising facially expressed emotions. Analyses are based on a sample of N = 151 young adults. First, we established a measurement model with a higher-order factor for the speed of recognising facially expressed emotions (SRE). This model has acceptable fit without specifying emotion-specific relations between indicators. Next, we assessed whether SRE can be reliably distinguished from the speed of recognising facial identity (SRI) and found the latent factors for SRE and SRI to be perfectly correlated. In contrast, SRE and SRI were both only moderately related to a latent factor for the speed of recognising non-face stimuli (SRNF). We conclude that the processing of facial stimuli as such, and not the processing of facially expressed basic emotions, is the critical component of SRE. These findings are at variance with suggestions of separate routes for processing facial identity and emotional facial expressions and suggest much more commonality between these streams, at least as far as processing speed is concerned.