The way we interpret other people's emotions depends not only on what we see but also on what we think. Strategies such as autosuggestion (mentally repeating phrases to shape how another person's emotions are perceived) and mental imagery (recreating sensory images in the absence of real stimuli) are simple, flexible, and can be applied without external help.
In a study led by Elena Azañón, experiments were conducted to investigate whether these strategies influence the perception of facial expressions. In two experiments, participants mentally repeated phrases such as “she is happy” while observing a neutral face (autosuggestion); in another experiment, they imagined that same face as happy or sad (mental imagery). Afterwards, they evaluated new faces, and participants consistently tended to perceive them as happier or sadder depending on the emotion they had previously suggested to themselves or imagined. Autosuggestion showed the more marked and consistent effects.
Interestingly, when researchers simply presented words like “happy” or “sad” next to the neutral face, there was no effect. This indicates that a superficial association is not enough: the mind must be actively engaged for perception to truly be altered.
These results suggest that autosuggestion and mental imagery may be promising tools in psychological training and support, especially in mood disorders such as depression, where there is a tendency to interpret expressions negatively. This study was published in the scientific journal Cognition, in the article Autosuggestion and mental imagery bias the perception of social emotions, as part of research project 296/18 - The power of mind: Altering cutaneous sensations by autosuggestion, supported by the Bial Foundation.
ABSTRACT
Cognitive processes that modulate social emotion perception are of pivotal interest for psychological and clinical research. Autosuggestion and mental imagery are two candidate processes for such a modulation; however, their precise effects on social emotion perception remain uncertain. Here, we investigated how autosuggestion and mental imagery, employed during an adaptation period, influence the subsequent perception of facial emotions, and to what extent. Separate cohorts of participants took part in five experiments, where they either mentally affirmed (autosuggested; Experiments 1a and 1b) or imagined (Experiment 2) that a neutral face was expressing a specific emotion (happy or sad). Subsequent facial emotion perception was then assessed by calculating points of subjective equality (PSEs) along a happiness-sadness continuum. Our results show that both autosuggestion and mental imagery induce a bias towards perceiving facial emotions in the direction of the desired emotion, with larger Bayes factors supporting autosuggestion. Experiment 3 confirmed the absence of effects when emotional words were presented together with a neutral face, suggesting a limited role of response bias in driving this effect. Finally, Experiment 4 validated the experimental setup by demonstrating standard contrastive aftereffects when participants were adapted to actual, physical emotional faces. Together, our findings provide an initial step towards understanding the potential of intentional cognitive processes to modulate social emotions, specifically by biasing emotional face perception. With comparable effect sizes observed for both autosuggestion and mental imagery, both strategies show promise for self-directed interventions. Their practical applicability may vary due to individual responses, preferred cognitive strategies, and potential overlaps in underlying cognitive mechanisms.
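The PSE measure described in the abstract can be sketched in a few lines. Assuming responses are summarized as the proportion of "happy" judgments at each morph level along the sad-happy continuum, the PSE is the morph level at which "happy" and "sad" responses are equally likely. The morph levels and proportions below are illustrative values, not the study's data, and the function name is hypothetical; the study's actual fitting procedure may differ (e.g. a full psychometric-function fit):

```python
def estimate_pse(morph_levels, prop_happy):
    """Point of subjective equality: the morph level at which a face is
    judged 'happy' and 'sad' equally often (proportion happy = 0.5).
    Uses linear interpolation between the two levels bracketing 0.5."""
    points = list(zip(morph_levels, prop_happy))
    for (x0, p0), (x1, p1) in zip(points, points[1:]):
        if (p0 - 0.5) * (p1 - 0.5) <= 0 and p0 != p1:
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    raise ValueError("proportions never cross 0.5")

# Illustrative data: morph levels from fully sad (-3) to fully happy (+3)
# and the proportion of 'happy' responses at each level (hypothetical).
levels = [-3, -2, -1, 0, 1, 2, 3]
baseline = [0.02, 0.08, 0.25, 0.55, 0.80, 0.95, 0.99]
print(round(estimate_pse(levels, baseline), 3))  # → -0.167
```

A bias in the suggested direction would show up as a shift of this value: after autosuggesting "happy", less physical happiness in the morph is needed to report "happy", moving the PSE toward the sad end of the continuum.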