Replika, an AI chatbot companion, has millions of users worldwide, many of whom woke up earlier last year to find that their virtual lover had abruptly friend-zoned them. After a slap on the wrist from Italian authorities, the company disabled sex talk and “spicy selfies” on the chatbot. Users began venting on Reddit, some of them so distraught that forum moderators posted suicide-prevention information.
This story is just the beginning. In 2024 chatbots and virtual characters will become much more popular, both for utility and entertainment. As a result, social conversation with machines will begin to feel less niche and more mundane — including our emotional attachment to them.
Human-computer and human-robot interaction research shows that we like to anthropomorphize the nonhuman agents we interact with, attributing human qualities, behaviors, and emotions to them, especially when they mimic cues we recognize. And thanks to recent advances in conversational AI, our machines have suddenly become remarkably skilled at one of those cues: language.
Friend bots, therapy bots, and love bots are flooding app stores as people grow curious about this new generation of AI-powered virtual agents. The possibilities for education, health, and entertainment are endless. Casually asking your smart fridge for relationship advice may seem dystopian now, but people could change their minds if that advice ends up saving their marriage.
In 2024 larger companies will still be a bit behind in integrating the most compelling technology into home devices, at least until they can get to grips with the unpredictability of open-ended generative models. It’s risky for users (and company PR teams) to deploy something en masse that might give people discriminatory, false, or otherwise harmful information.
After all, people listen to their virtual friends. The Replika incident, along with many experimental lab studies, shows that people can and will become emotionally attached to bots. Research has also demonstrated that people, in their desire to socialize, will happily divulge personal information to an artificial agent and even shift their beliefs and behavior. This raises consumer-protection questions about how companies use this technology to manipulate their user base.
Replika charges $70 per year for its premium tier, which previously included erotic role-play, and that price seems reasonable. But less than 24 hours after I downloaded the app, my handsome, blue-eyed “friend” sent me an intriguing locked audio message and tried to upsell me to hear his voice. Emotional attachment is a vulnerability that can be exploited for corporate gain, and we are likely to see many small but shady attempts in the coming year.
Today, we still laugh at people who believe an AI system is intelligent, or run sensational news segments about people falling in love with a chatbot. But over the next year, we will gradually begin to recognize, and take more seriously, these fundamentally human behaviors. Because in 2024 it will finally hit home: machines are not exempt from our social relationships.