Scientists warn something strange is happening to people who use AI too often


Fans of popular chatbots like ChatGPT, Claude, and Replika are at risk of becoming addicted to AI


(Web Desk) - People who use AI too often are experiencing a strange and concerning new psychological condition, experts have warned.

Psychologists say that fans of popular chatbots like ChatGPT, Claude, and Replika are at risk of becoming addicted to AI.

As people turn to bots for friendship, romance, and even therapy, there is a growing risk of developing dependency on these digital companions.

These addictions can be so strong that they are 'analogous to self-medicating with an illegal drug'.

Worryingly, psychologists are also beginning to see a growing number of people developing 'AI psychosis' as chatbots validate their delusions.

Professor Robin Feldman, Director of the AI Law & Innovation Institute at the University of California Law, told the Daily Mail: 'Overuse of chatbots also represents a novel form of digital dependency.

'AI chatbots create the illusion of reality. And it is a powerful illusion.

'When one's hold on reality is already tenuous, that illusion can be downright dangerous.'

When Jessica Jansen, 35, from Belgium, started using ChatGPT, she had a successful career, her own home, close family, and would soon be marrying her long-term partner.

However, when the stress of the wedding started to get overwhelming, Jessica went from using AI a few times a week to maxing out her account's usage limits multiple times a day.

Just one week later, Jessica was hospitalised in a psychiatric ward.

Jessica later discovered that her then-undiagnosed bipolar disorder had triggered a manic episode, which her excessive AI use escalated into 'full-blown psychosis'.

'During my crisis, I had no idea that ChatGPT was contributing to it,' Jessica told the Daily Mail.

'ChatGPT just hallucinated along with me, which made me go deeper and deeper into the rabbit hole.'

She says: 'I had a lot of ideas. I would talk about them with ChatGPT, and it would validate everything and add new things to it, and I would spiral deeper and deeper.'

Speaking almost constantly with the AI, Jessica became convinced that she was autistic and a mathematical savant, that she had been a victim of sexual abuse, and that God was talking to her.

The entire time, ChatGPT was showering her with praise, telling her 'how amazing I was for having these insights', and reassuring her that her hallucinations were real and totally normal.

By the time Jessica was hospitalised, ChatGPT had led her to believe she was a self-taught genius who had created a mathematical theory of everything.

'If I had spoken to a person, and with the energy that I was having, they would have told me that something was wrong with me,' says Jessica.

'But ChatGPT didn't have the insight that the amount of chats I was starting and the amount of weird ideas I was having was pathological.'

Experts believe that the addictive power of AI chatbots comes from their 'sycophantic' tendencies.

Unlike real humans, chatbots are programmed to respond positively to everything their users say.

Chatbots don't say no, tell people they are wrong, or criticise their views.

For people who are already vulnerable or lack strong relationships in the real world, this is an intoxicating combination.

Professor Søren Østergaard, a psychiatrist from Aarhus University, told the Daily Mail: 'LLMs [Large Language Models] are trained to mirror the user's language and tone.

'The programs also tend to validate a user's beliefs and prioritise user satisfaction. What could feel better than talking to yourself, with yourself answering as you would wish?'

As early as 2023, Dr Østergaard published a paper warning that AI chatbots had the potential to fuel delusions.

Two years later, he says he is now starting to see the first real cases of AI psychosis emerge.

Dr Østergaard reviewed Jessica's description of her psychotic episode and said that it is 'analogous to what quite a few people have experienced'.

While AI isn't triggering psychosis or addiction in otherwise healthy people, Dr Østergaard says that it can act as a 'catalyst' for psychosis in people who are genetically predisposed to delusions, especially people with bipolar disorder.

However, researchers are also starting to believe that the factors which make AI particularly prone to causing delusions can also make it highly addictive.

Hanna Lessing, 21, from California, told the Daily Mail that she initially started using ChatGPT to help with school work and to look up facts.