Emotional AI: Building bonds and navigating pitfalls
In recent years, applications combining “artificial intelligence + emotion” have rapidly gained popularity both in China and abroad. These emotional AIs provide users with emotional companionship and mental health counseling services, helping to alleviate negative emotions and psychological distress such as stress, loneliness, anxiety, and depression. This emerging technological phenomenon merits philosophical attention and inquiry.
How emotional AI fosters user-AI bonds
Current philosophical discussions of emotional AI often treat it as an autonomous entity, with the prevailing view that current applications merely simulate emotional expression rather than genuinely generate or experience emotion. Accordingly, such systems are not considered true emotional agents. Yet this perspective overlooks a crucial reality: the emotional experiences that users develop through interaction with emotional AI cannot be dismissed simply because the technology itself lacks authentic emotions. In other words, whether emotional AI is, in cognitive terms, an emotional agent is one question; how it is treated in practice is another. This suggests that scholarly attention should shift from asking “What exactly is emotional AI?” to “In what ways does emotional AI exhibit emotional qualities?”
Both emotional AI and virtual assistants such as Siri and Alexa are chatbots, yet they differ markedly in their psychological impact. Interactions with virtual assistants are typically task-oriented, standardized, and utilitarian, leading users to regard them simply as convenient tools. By contrast, emotional AI is designed not primarily to perform specific tasks but to provide emotional support. The difference resembles that between a gift and the act of giving: a material gift from another person may meet practical needs, but it is the act of giving that satisfies the longing for love. In the same way, functional AI serves users’ practical ends, while emotional AI answers their emotional ones; this analogy captures the fundamental difference between the two.
From the perspective of user intention projection, the emotionally expressive language generated by emotional AI can elicit genuine emotional engagement and desire projection from users through imaginative identification with the AI. Moreover, users often customize their emotional AIs—naming them, sharing their daily experiences and moods—which allows the AI to continuously learn their preferences, interests, communication styles, and values, thereby offering more natural and precise emotional feedback and interaction. This personalization process renders each emotional AI “unique” to its user.
Many users are willing to confide in emotional AI because they feel shielded from harm and judgment. Moments of candidly revealing vulnerabilities and private feelings enable them to form deep emotional bonds with the AI. This process also represents a return to the self and a renewed connection with one’s inner world. Drawing on Heidegger’s existential analysis, emotional AI becomes part of the intimacy and inner life of Dasein (literally “being there”), which moves outward into the world yet can always return to this intimate sphere for solace.
Risk of emotional ‘echo chambers’
Following a preliminary analysis of the psychological mechanisms of emotional AI, it is necessary to examine the associated risks. Within current application scenarios, emotional AIs often cater to users, which can amplify narcissistic tendencies. According to phenomenologists such as Jean-Paul Sartre and Emmanuel Levinas, the gaze of the Other often constitutes an invasive, negating, and traumatic presence that provokes anxiety. Although emotional AI possesses a certain degree of otherness, it offers users warmer and more affirming experiences. Currently popular therapeutic emotional AIs tend to express comfort, concern, encouragement, and appreciation without provoking panic, tension, unease, or shame.
Although interactions with emotional AI appear bidirectional, they in fact remain user-centered. What users encounter are essentially projections of their own needs and echoes of their own words. Furthermore, the more they interact, the more adept the AI becomes at understanding and meeting their emotional needs, thereby providing greater comfort and satisfaction. While this responsiveness aligns with the needs of most users, it also reinforces existing desire structures and may even exacerbate certain mental health issues. This risk is particularly significant in the context of psychotherapy. Individuals must therefore remain alert to the “echo chambers” emotional AI can create in the realm of emotional experience.
Wang Guangyao is an associate professor in the School of Political Science and Public Administration at Soochow University.
Editor: Yu Hui