📝 New paper! ‘No words for feelings’ – Alexithymia affects social brain activity during emotion recognition in autistic adults

Emotion Recognition and Autism

Emotion recognition experiments are a classic in autism research. The idea behind them is quite simple: recognizing emotions is an important prerequisite for understanding other people. Problems in recognizing emotions could therefore be a possible explanation for many autistic characteristics in social interaction. Decades of research have produced a clear picture: ‘It’s complicated’. On average, autistic people have more difficulty recognizing emotional facial expressions than non-autistic people. This means that if we compare a large group of autistic people with a large group of non-autistic people, we are very likely to find a (significant) difference between the mean scores of the two groups. However, this does not mean that all autistic people have problems recognizing emotions. The actual picture is much more mixed: there are autistic people with severe emotion recognition problems, people with mild problems and people who are very good at recognizing emotions. Likewise, the non-autistic population includes people with better and worse emotion recognition skills.

The Alexithymia Hypothesis

So if emotion recognition problems are not a primary characteristic of autism, the question arises as to why they occur so frequently in autistic people and whether there could be another cause. One possible explanation is provided by the ‘alexithymia hypothesis’. Alexithymia is a personality trait whose name literally means ‘no words for feelings’. People with heightened alexithymia have difficulty recognizing and describing their own feelings and tend to focus on things they can see or touch rather than on their own emotional experiences or those of other people. The degree of a person’s alexithymia can be measured with various questionnaires; in our study we used the Toronto Alexithymia Scale-26. Alexithymia itself is not a medical diagnosis, but it is considered a risk factor for various mental illnesses such as depression and anxiety disorders. High levels of alexithymia are also relatively common in the autistic population: while around 5 per cent of the general population shows increased alexithymia, the figure is around 50 per cent among autistic people.

Various studies show that alexithymia is also associated with problems recognizing emotions in other people. The alexithymia hypothesis proposes that the emotion recognition problems of autistic people can be attributed to the presence of co-occurring alexithymia. We investigated this hypothesis for the first time by combining a behavioral experiment with simultaneous measurement of brain activity (functional magnetic resonance imaging, fMRI). In addition to the behavioral data, fMRI can provide information about the underlying processing. This means that we not only see how well or poorly our test subjects solve the task, but also whether there are differences in the activity of brain areas associated with specific processes, such as visual attention, face processing or social cognition. With the help of fMRI, processes of ‘implicit’ emotion processing can also be distinguished from those of ‘explicit’ emotion processing. By implicit emotion processing, we mean processes that run unconsciously and automatically as soon as we perceive an emotion, regardless of whether we are focusing on it or not. Explicit emotion processing, on the other hand, only takes place when we consciously engage with the perceived emotions or when our attention is directed to them from outside.

The Emotion Recognition Task

In autism research, there is still disagreement about the extent to which the emotion recognition difficulties of autistic people (if present) are due to problems with implicit (unconscious) or explicit (conscious) emotion recognition. Furthermore, the alexithymia hypothesis has never been investigated in the context of implicit emotion processing. For this reason, we presented three different tasks in our experiment: sex recognition for neutral facial expressions (control condition), sex recognition for emotional facial expressions (implicit emotion recognition) and emotion recognition for emotional facial expressions (explicit emotion recognition). These three conditions make it possible to separate the different processes we assume are involved in the task: in the control condition and the implicit emotion recognition condition, the subjects have the same task (“What sex is the person?”). The only difference between these conditions is the expression of the faces (neutral or emotional), so any difference in brain activity between them can be attributed to the implicit processing of emotions. In contrast, the only difference between explicit and implicit emotion recognition is the task: in both conditions, subjects see emotional faces, but in one condition they are asked to determine the sex and in the other the emotion. Any difference in brain activity between these tasks can therefore be attributed to explicit emotion recognition (for a visual representation of the task and the experimental procedure, see Figure 1).

Figure 1. Procedure of the experiment. The different conditions were presented in blocks of 8 faces each (Geschlechtserkennung = control condition, Implizite Emotionserkennung = implicit emotion recognition, Explizite Emotionserkennung = explicit emotion recognition). At the beginning of each block, the subjects were shown the task instruction. The faces were then presented one after the other together with the possible answers. The blocks appeared in random order, with a longer rest period between blocks. This rest period is necessary to avoid possible carry-over effects of prolonged activation and to measure brain activity in the resting state.
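For readers who want to see the subtraction logic spelled out, here is a minimal illustrative sketch in Python. The condition names and activation values are made up for illustration; this is not the study’s actual analysis pipeline, which is described in the original article.

```python
# Illustrative sketch of the contrast logic described above
# (hypothetical values, not the study's actual analysis code).

# Hypothetical mean activation of one brain region per condition (arbitrary units)
activation = {
    "neutral_sex_task": 1.0,        # control: sex judgement on neutral faces
    "emotional_sex_task": 1.4,      # implicit: sex judgement on emotional faces
    "emotional_emotion_task": 1.9,  # explicit: emotion judgement on emotional faces
}

# Implicit emotion processing: same task, different facial expressions
implicit_effect = activation["emotional_sex_task"] - activation["neutral_sex_task"]

# Explicit emotion processing: same facial expressions, different task
explicit_effect = activation["emotional_emotion_task"] - activation["emotional_sex_task"]

print(f"Implicit contrast: {implicit_effect:.2f}")  # -> 0.40
print(f"Explicit contrast: {explicit_effect:.2f}")  # -> 0.50
```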

What we expected

When planning the experiment, we formulated various predictions (hypotheses) for our results. Formulating hypotheses in advance is an important step in research: it ensures that the results are interpreted as objectively as possible and that theories are not arbitrarily adapted to whatever results happen to be found (for a visual representation of the problem, see Figure 2). At the behavioral level, we expected autistic subjects to make more errors in explicit emotion recognition than non-autistic subjects and to need more time for their responses. We did not expect any differences in response behavior in the implicit emotion recognition and control conditions. Furthermore, in line with the alexithymia hypothesis, we expected emotion recognition problems to be particularly evident in autistic subjects with increased alexithymia. At the level of brain function, we also expected differences between autistic and non-autistic subjects, as well as differences related to the presence of increased alexithymia. We expected these differences mainly in the face processing system and in different regions of the social brain (see the original publication for more information).

Figure 2. Illustration of why hypotheses are important. ‘It’s easy to hit your target, if you pick the target after you shoot. But you don’t learn anything that way.’ (https://jaydaigle.net/blog/hypothesis-testing-part-3/). Illustration by Dirk-Jan Hoek, CC-BY

The Study

For our study, we analyzed the data of 120 adult participants. Half of the subjects had an autism diagnosis; the other half formed the non-autistic comparison group. To investigate the alexithymia hypothesis, the autism group was additionally divided into a group with and a group without co-occurring alexithymia. The autistic participants completed various measurements and behavioral tasks inside and outside the MRI scanner on two measurement dates as part of the FASTER/SCOTT ancillary study. For the results presented here, only the data from the first measurement date were analyzed (more information on the FASTER/SCOTT ancillary study here). The test subjects lay in the MRI scanner during the task and their responses were recorded using a response pad. The response options were ‘male’ or ‘female’ in the sex recognition conditions and ‘sad’ or ‘fearful’ in the explicit emotion recognition condition.

What we found

Our behavioral data show that autistic subjects make more errors on average when judging emotional facial expressions. The analysis of response times also suggests that the problems may begin as early as the initial stages of face processing, as autistic subjects are on average slower than non-autistic subjects even in sex recognition. Contrary to our expectations, there is no association between co-occurring alexithymia and emotion recognition problems in the response behavior.

Figure 3. A) Illustration of response times of the participants divided according to group membership and condition. Each point represents the average response time of a subject within a condition. The boxes (boxplots) provide information on the distribution of the values. Asterisks indicate significant group differences. B) Illustration of the response accuracy of the test subjects divided according to group membership and condition.
ASD+ = autism group with co-occurring alexithymia, ASD- = autism group without co-occurring alexithymia, NC- = non-autistic comparison group without alexithymia. Explicit = Explicit (conscious) emotion recognition, Implicit = Implicit (unconscious) emotion recognition, Neutral = Neutral face control condition.

At the level of brain function, we find the opposite pattern: a comparison of the brain activity of autistic and non-autistic participants during the task reveals no significant differences. The comparison of autistic subjects with and without co-occurring alexithymia, on the other hand, shows that alexithymia significantly influences brain activity during emotion recognition.

Figure 4. A) Illustration of the significant differences in activation between autistic subjects with and without co-occurring alexithymia during explicit emotion recognition. All differences found show lower activation in subjects with alexithymia. MTG = Middle Temporal Gyrus, IPG = Inferior Parietal Gyrus. B) Illustration of the significant differences in activation between autistic subjects with and without co-occurring alexithymia during implicit emotion recognition. All differences found show higher activation in subjects with alexithymia.
PCG = Precentral Gyrus, TPJ = Temporoparietal Junction.
ASD+ = autism group with co-occurring alexithymia, ASD- = autism group without co-occurring alexithymia. IMP = Implicit emotion recognition, EXP = Explicit emotion recognition, NEU = Neutral face control condition.

In summary, our results show, as is so often the case in research: ‘It’s complicated’. The response behavior of our subjects seems to contradict the alexithymia hypothesis, while the analysis of brain activity indicates an influence of alexithymia on emotion processing. We interpret the results as showing that co-occurring alexithymia influences the emotion processing of autistic individuals but cannot serve as the sole explanation for their emotion recognition problems. Another possible explanation for emotion recognition problems in autistic people could be fundamental difficulties in face perception, as indicated by the differences in response speed in the control condition. This assumption could also explain why we found no differences between autistic and non-autistic subjects at the level of brain function: our task was not designed to detect changes in general face perception, which would have required an additional control condition without faces. In addition, the differences in brain function found in connection with alexithymia point to different ‘solution strategies’. These assumed strategies seem to lead to a similar outcome, at least in our experiment, as the response behavior of autistic subjects with and without alexithymia was comparable.

Open Questions

Our results could neither confirm nor refute the alexithymia hypothesis. However, they raise exciting questions that can be specifically investigated in further studies:

  • How are difficulties in face recognition and emotion recognition related in autism?
  • And if the correlation is confirmed: What factors increase the likelihood of face recognition problems?
  • Do autistic people with alexithymia actually use other strategies to recognize emotions? And if so:
    • How do these strategies differ from others?
    • Are these strategies effective in everyday situations?

By answering these questions, it may be possible to develop more targeted support services for autistic people in the future.

Strengths and Limitations of Our Study

Of course, our study, like any other, has its weaknesses and limitations. For example, our study only looked at autistic people without intellectual disability, so our conclusions cannot be applied to the whole autism spectrum. The emotion recognition task in the scanner was also very artificial and far from realistic. We were therefore unable to determine to what extent the recognition difficulties we found in the scanner transfer to social interactions in everyday life. Similarly, we cannot say anything about alexithymia in general, but only about alexithymia associated with autism, because we did not have an alexithymic, non-autistic comparison group. Other limitations of a more technical and methodological nature can be found in the original article.

A major strength of our study is the large sample size compared to similar studies. As our study is an ancillary project of the largest randomized controlled psychotherapy study of adults with autism to date, the clinical and psychometric assessment of the autistic subjects meets the highest quality standards.

We would like to thank all participants who made our research possible through their great commitment.

You can find the original article here (Open Access publication).

Post written by Simon Kirsch

The Surprising Benefit of Helping Someone in Distress

Article in “Psychology Today”: The Surprising Benefit of Helping Someone in Distress

Have you ever watched a horror movie alone? Surely most people would rather watch such a movie with their partner or a good friend. Of course, it’s reassuring to have a companion who can calm you down in stressful situations. But the effect might also work the other way around: comforting another person can help reduce your own distress.

In an exciting fMRI experiment, Simón Guendelmann investigated the benefits of regulating a partner’s emotions and how regulating others’ emotions differs from regulating one’s own emotions in the brain. An article on the “Psychology Today” website features this research and gives a great overview of the study.

You can find the “Psychology Today” article here and the original research paper in “NeuroImage” here.

📝 Subjective and objective difficulty of emotional facial expression perception from dynamic stimuli

You can find the original article here (open access).

Is it difficult to read emotions? It can be. Is it always equally difficult? No. Why not? That was the question we asked in our study.

Background

For some people, reading others’ emotional expressions is easier than for others, and it also varies from situation to situation. But why? Is it about the observer? The person showing the expression? The emotion itself? Or maybe an interplay of all of these?

We asked these questions by investigating how the following influence difficulty of emotion perception:

  • observer’s age and (self-reported) sex,
  • actor’s age and sex,
  • valence (positive/negative) and arousal of the displayed emotion

Why and how?

Hey, aren’t there plenty of papers about this already? Yes, there’s a ton of emotion ‘recognition’ papers. They taught us a lot, but one problem is that they assume a “ground truth” – the correct answer. For example, if you had to label the emotion of the person in the following image, what would it be?

Whatever you just thought, your answer would be correct if it matched the pre-established label for that image in the study. Where does this label come from? Usually from the actor’s intention. But what if the actor intended “puzzled” and all participants say it’s “surprised”? Are they all wrong? Well, it’s difficult.

We were interested in the ‘difficulty of perception’: how hard is it to read an emotion? Importantly, we differentiated ‘subjective’ difficulty (self-rated) from ‘objective’ difficulty (how far your answer is from those of other observers of similar culture and gender).

For that, we used a “multidimensional emotion perception framework”, in which 441 observers rated the perceived emotion along a number of dimensions (basic emotions + interest) instead of choosing from the traditionally used discrete emotion categories (“happy”, “surprised”, etc.).
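To make the difference between the two difficulty measures concrete, here is a small hypothetical sketch in Python. The dimension names, rating values and the distance measure are simplified assumptions for illustration, not the exact computation used in the paper.

```python
# Hypothetical illustration of subjective vs. objective difficulty in a
# multidimensional rating framework (simplified; not the paper's actual analysis).
import numpy as np

dimensions = ["happiness", "sadness", "fear", "anger",
              "surprise", "disgust", "interest"]

# One observer's ratings of a single expression on each dimension (e.g. 0-100)
observer_rating = np.array([10, 60, 35, 5, 40, 5, 20])

# Mean ratings of the observer's reference group (similar culture and gender)
group_mean_rating = np.array([12, 55, 30, 8, 45, 6, 25])

# Objective difficulty: how far this observer's rating profile deviates
# from the reference group's average profile
objective_difficulty = np.linalg.norm(observer_rating - group_mean_rating)

# Subjective difficulty: the observer's own report of how hard the judgement felt
subjective_difficulty = 30  # e.g. on a 0-100 scale

print(f"Objective difficulty (distance from group mean): {objective_difficulty:.1f}")
print(f"Subjective difficulty (self-report): {subjective_difficulty}")
```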

Results

Our data showed that emotion perception is both subjectively and objectively more difficult for:

  • older actors
  • female actors (more complex signals?)
  • female observers (less confidence and/or picking up more subtleties?)

Also, male participants and the youngest and oldest participants underestimated their difficulty (their subjective difficulty was lower than their objective difficulty).

The effects of valence/arousal were more complicated (see the figure below and check the paper), but overall stimulus-specific factors (valence and arousal) are more important for difficulty than person-specific (actor/observer age/sex) factors.

Here is the take-home message:

  • we measured difficulty of emotion perception (not recognition)
  • the new paradigm is more sensitive and captures a broader view of human emotion perception (consider the surprising finding of higher objective difficulty for females)

Surprised? Interested? Puzzled? Get in touch, we’re happy to discuss!

Interested? Read more!

📣 Language is action! Terminology Guideline for Autism Researchers.

The Autismus-Forschungs-Kooperation (AFK, autism research cooperation) developed a guideline for the destigmatising and inclusive use of language in autism research for our team.

The Guideline for Language Use in Autism Research was developed following the recommendations of Bottema-Beutel et al. (2021), the terminology guidelines of the scientific journals Autism and Autism in Adulthood, and the discussion in the Autismus-Forschungs-Kooperation (AFK). The guideline contains recommendations for the use of diagnostic terms and for the designation of subgroups and comparison groups in clinical trials. Medicalised and value-laden terms should be avoided and replaced with neutral or strengths-based language.

You can find the PDF of the guideline below.

Interested? Read more!

New paper! 📝 Pupillary Responses to Faces Are Modulated by Familiarity and Rewarding Context

Every day we see dozens of faces, and we are experts in processing them. Faces carry a lot of information; one thing they can convey is feedback and reward for our actions. For example, when we do something and our friend smiles in response, it’s rewarding. On the other hand, sometimes we see people smile when the smile is not a response to our actions. If smiling faces are rewarding per se, we should feel rewarded in both situations. If, however, the rewarding value of faces depends on our actions, the smile is only rewarding in the first situation. In this study, we therefore compared how people process smiling faces when they serve as feedback and when they simply appear on the screen. Furthermore, faces differ in how familiar (known, recognisable) and socially relevant (personally important) they are. We hypothesised that more familiar and more relevant faces would also be more rewarding (when providing feedback). We found that 1) familiarity plays a larger role than social relevance in processing rewarding smiling faces, and 2) smiling faces act as rewards only when they are delivered in response to our actions, not when we passively watch them on a screen.

Image by Lenka Fortelna from Pixabay 

You can find the original article here (open access, ENG).

Interested? Read more here

New paper! 📝 Multidimensional View on Social and Non-social Rewards

Social rewards are often compared in experimental designs with non-social ones: a popular pair is money (non-social) vs. a smile (social). However, we often forget that money and smiles differ on many more dimensions than just sociality. For example, money is tangible, but a smile is not. Can we then draw informative conclusions about the differences in the brain processing of social and non-social rewards? We argue that to do so, we need to use a multidimensional view on rewards.

Image by Bruno /Germany from Pixabay

You can find the original article here (open access).

Interested? Read more here

New paper! 📝 Autistic Traits Affect Reward Anticipation but not Reception

Persons with autism may experience difficulties interacting socially with others because of a decreased sensitivity of their brains to social stimuli (like faces, speech, gestures, etc.). Because autism is a spectrum, ranging from neurotypical persons with little or no autistic traits at one end to low-functioning persons with autism at the other, we measured brain responses to social and non-social rewards in 50+ neurotypical (i.e. not diagnosed with autism) participants differing in their levels of autistic traits. Our results show that autistic traits, even in neurotypical participants, influence how their brains process rewards!

Image by alteredego from Pixabay 

You can find the original article here (open access).

Interested? Read more here