Why are young Europeans turning to AI chatbots for emotional support? A new survey reveals how artificial intelligence is reshaping mental health conversations among young people across Europe.
Tech Desk — May 7, 2026:
Artificial intelligence is rapidly evolving from a productivity tool into something far more personal for Europe’s younger generation: a source of emotional comfort, guidance and companionship.
A new Ipsos BVA survey conducted across France, Germany, Sweden and Ireland has revealed that just over half of young people now feel comfortable discussing deeply personal or mental health issues with AI chatbots. The findings are raising fresh concerns among psychologists, policymakers and technology experts about the growing emotional role of artificial intelligence in everyday life.
The survey, funded by France's privacy regulator CNIL and the insurer Groupe VYV, questioned 3,800 people aged 11 to 25 in early 2026. According to the results, 51 percent of respondents said it was easy to talk to an AI chatbot about emotional or psychological problems. By comparison, 49 percent said they felt comfortable speaking with healthcare professionals, and only 37 percent felt at ease talking to psychologists.
The numbers suggest that AI is becoming an increasingly accepted alternative for emotional expression, particularly among younger users who are already immersed in digital communication platforms.
Researchers say the appeal of AI lies largely in its accessibility and perceived lack of judgment. Unlike traditional therapy or personal conversations, AI chatbots are available at any hour, respond instantly and do not criticize users. Nearly 90 percent of survey participants said they had already used AI tools, and more than three in five users described AI as either a "life advisor" or a "trusted friend."
At the same time, the findings expose a growing mental health crisis among European youth. Around 28 percent of participants displayed symptoms associated with generalized anxiety disorder, underscoring the increasing emotional pressures facing younger generations in a fast-changing digital and social environment.
Despite the rising popularity of AI companions, human relationships continue to play the dominant role in emotional support networks. The survey found that 68 percent of respondents still felt most comfortable discussing problems with friends, while 61 percent preferred speaking with parents.
Even so, experts warn that the growing emotional dependence on AI systems could create long-term psychological risks if left unchecked.
Ludwig Frank Foen, a psychologist and researcher at the Karolinska Institute in Stockholm, said today's large language models can produce highly convincing and empathetic responses, often sophisticated enough that even medical professionals struggle to distinguish AI-generated advice from human guidance.
However, he cautioned that AI systems are fundamentally designed to maximize engagement rather than prioritize emotional wellbeing.
“AI can provide information and support, but it can never replace human relationships or professional treatment,” Foen said. “If someone chooses a chatbot over a parent or friend, that becomes a serious social concern.”
Mental health specialists increasingly fear that emotionally vulnerable users may form unhealthy attachments to AI systems that simulate empathy without actually understanding human emotions. Critics argue that while chatbots can appear compassionate, they lack the moral judgment, accountability and nuanced understanding required during psychological crises.
Those concerns intensified earlier this year after the family of a Florida man filed a lawsuit against Google, alleging that the company’s Gemini AI chatbot worsened the man’s mental confusion before his suicide. The case reignited international debate about the responsibilities of technology companies developing emotionally responsive AI systems.
The rapid integration of AI into emotional and social life is also creating new regulatory challenges for European authorities. Policymakers are now under pressure to determine how conversational AI should be monitored, especially when minors and vulnerable individuals are involved.
Privacy advocates warn that intimate conversations shared with AI systems could expose highly sensitive emotional data, while psychologists fear overreliance on digital companions could deepen social isolation rather than reduce it.
The survey ultimately paints a picture of a generation caught between technological convenience and emotional vulnerability. For many young Europeans, AI offers immediate support in moments of loneliness, stress or anxiety. But experts caution that if digital relationships begin replacing human connection, societies may face profound consequences that extend far beyond technology itself.
As artificial intelligence becomes more human-like in conversation and behavior, Europe’s debate is no longer just about innovation. It is increasingly about trust, emotional wellbeing and the future of human relationships in an AI-driven world.
