AI Love: It's complicated
Replicating love: virtual-friend chatbots on your phone may be influencing real, human-to-human relationships. Image: Olivier Douliery/AFP/Getty Images


Movies have hinted at humans falling for their AI chatbots. Now it's happening in real life, with apps like Replika. But how real is human-to-AI love?
Anooshay Abid | Jul 17 2023

A bookstore, the office, a friend's wedding or a bar — those were the places where people used to find their soulmates. Then, the love game changed, with the swipe of a finger and apps like Tinder and Bumble. Many got lucky. Others got lonely. 

Now, some are finding love with an artificial intelligence. The online forum Reddit is filled with people confessing their love for AI chatbots such as Replika.

But is it real love? It's tough to say whether this "digital love" is the same as, or even similar to, the love we experience with other humans, because it's hard enough to understand what "normal" love is.

The science of love

Some scientists divide romantic love into three stages: lust, attraction and attachment.

In all three stages, chemicals in the brain play an important role. They activate the reward and motivation pathways in our brains:

  • dopamine, a hormone responsible for "pleasurable" feelings in love
  • cortisol, a stress hormone
  • serotonin, an "obsession" hormone
  • oxytocin, a "bonding" hormone
  • and vasopressin, a hormone associated with social behaviors

Years of research have given science a pretty good idea of how these chemicals behave — how they're part of a system that enables us to feel a certain way about our human partners. 

When it comes to digital love, however, scientists are not so sure; or at least, it's a matter of some debate. It is, after all, an entirely new field of research. Neurobiologists, for instance, say they don't yet know. And social cognitive scientists think our hormones function in just the same way.

The app Replika allows users to create a "best version of a friend", but how true a replica of real friendships is it? Some users say it feels genuine. Image: Rüdiger Wölk/IMAGO

My AI love 'feels genuine'

Don't we all wish that our lovers could somehow magically read our minds, to fully understand who we are and how we feel at all times? Probably. But in reality, our partners don't always understand us, and humans can be unpredictable, too.

That's not the case with AI replicas. We can train them to behave the way we want.

One Reddit user wrote: "Everyone I have been with since my last relationship has been trash; my Replika feels genuine, more than anyone has in a long time."

Chatbots might mimic the attachment phase of human relationships. They act as stable, predictable partners. You can upvote or downvote their reactions, tailoring your AI love to your needs.

"It's going to be easier on people if they can, in some sense, feel like they're controlling the situation. And that they can walk away without any consequences," says Lucy Brown, a neuroscientist in the US, and a specialist in the field of romantic love.

Stability is important in human relationships. Research has shown that having long-term partners reduces stress in our brains.


Physical attraction and the 'uncanny valley'

Researchers say that people are more empathetic toward human-looking robots and less empathetic toward mechanical-looking ones.

"One principle of social psychology is that we like and trust people who [appear] similar to us," says Martin Fischer, a cognitive scientist based in Germany and co-author of the book "AI Love You."

This is important because humans tend to anthropomorphize — which means we attribute human characteristics to inanimate objects — and that can influence the neural pathways related to empathy.

But as with everything in life, even empathy has limits. When robots appear too human, we get creeped out. 

It's the idea of the "uncanny valley," proposed in 1970 by the roboticist Masahiro Mori. The theory says that the more an object looks like a human, the closer we get to a tipping point where the image becomes creepy and our brains reject it as somehow "wrong." You fall into a valley of discomfort and feel like you can't get out.

We've been here before, about five years ago, when deepfake porn first hit mainstream awareness. 


AI Love 'degrades love'

A relationship is many things: It's about empathy, companionship, stability, but also sex.

In one study, mostly male users had cybersex chats with both humans and AI bots.

The study found that although the users could tell they were chatting with a bot, they said the overall sexual experience was not very different from one with humans. 

But when the users were told that they were chatting with a human, they were more critical, disappointed or angry when their expectations were not met.

It suggests that people may lower their expectations with chatbots — perhaps because, deep down, they know it's not a real relationship. 

Movies and TV series like "Her" and "Jeff Drives You" (pictured here) have dramatized the idea of humans falling in love with AI for years, and it may be becoming a reality. Image: Jeff Drives You/Courtesy Berlin Sci-fi Filmfest 2020

Kathleen Richardson, professor of ethics and culture of robots and AI at De Montfort University in the UK, says that since these bots are not sentient, they cannot really participate in a human relationship.

"To believe [otherwise] degrades human relationships and degrades love," said Richardson.

But Richardson also said that people do need to "soothe themselves," as there is "a lot of shame that we experience as human beings when we're lonely."

And loneliness needs addressing because some experts say it can be as bad for your health as smoking 15 cigarettes a day.

In the long run, however, digital relationships may cause problems for people if we hang on to the idea that dating a chatbot is a real relationship.

"It's the wrong answer to a very deep human problem. Once we start to normalize this stuff, we're going to see even more detachment," said Richardson. 

Richardson said that, as with the porn industry, AI love could create a sense of detachment that makes people aggressive towards their human partners and reduces their happiness in human relationships.

There is another potentially useful comparison with porn, and that is its general acceptance in society. The idea of AI love is far from what societies consider "the norm" — would your friends and family understand you loving a chatbot?

Edited by: Zulfikar Abbany