
The AI Girlfriend

Introduction

Picture this: you’re a man in your thirties, recently single and recovering from a breakup that happened earlier in the year. Months have passed, but you still find yourself asking the same questions, dwelling on the same mistakes, searching for the same answers. You start listening to world-renowned psychotherapist Esther Perel in the hope of finding clarity and decide to book an appointment. Only, her calendar is packed, leaving you with no choice but to wait for her next opening.

Unless, of course, there were two of her.

In 2023, Alex Furmansky set out to duplicate Esther Perel using AI. He built a chatbot that mimicked her tone and style using her corpus of work and OpenAI’s API: in other words, by feeding it her books, podcast episodes, and other data, the bot ‘learned’ how she would respond to a given prompt. Furmansky left the conversation “with the most clarity [he’s] ever had”, describing the experience as nothing short of “magical”.
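Furmansky hasn’t published his implementation, but the basic recipe is simple enough to sketch. Here’s a minimal version in Python, assuming a plain system-prompt approach with OpenAI’s chat completions API; the model name, the stubbed persona excerpts, and the ask() helper are all illustrative placeholders, not his actual pipeline.

# A minimal sketch of a persona-mimicking chatbot using OpenAI's
# chat completions API. Everything persona-specific here is a stub;
# Furmansky's real system drew on Perel's books and podcast
# transcripts, which are not reproduced here.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# In a real system these excerpts would be pulled from the persona's
# corpus; here they are placeholders standing in for that data.
persona_excerpts = [
    "Excerpt 1 from the persona's published work...",
    "Excerpt 2 capturing their characteristic tone...",
]

system_prompt = (
    "You answer in the tone and style of a well-known relationship "
    "therapist. Ground your answers in these excerpts from their "
    "work:\n\n" + "\n\n".join(persona_excerpts)
)

def ask(question: str) -> str:
    """Send one user question and return the persona-styled reply."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder; any chat model would do
        messages=[
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask("How do I stop dwelling on a breakup from months ago?"))

A production version would presumably chunk and embed the whole corpus and retrieve only the passages relevant to each question, but the style mimicry itself is mostly a matter of prompting.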

But can AI actually give us useful insight into relationships? Does establishing a ‘connection’ with an AI bot make us less lonely, or does it do more harm than good?

Incel Culture and the Loneliness Epidemic

There’s no denying there’s a loneliness epidemic (in 2022, more than a third of EU survey respondents reported feeling lonely at least some of the time), and people are looking for ways to find company. After all, we are social creatures, and one of the main pillars of human happiness is a feeling of connection, community, and belonging. You might expect social media and technology to increase connectivity (shouldn’t it be easier to stay in touch?), but it turns out we’re feeling more alienated than ever. The thing is, social media is not a social platform; it is a product. And its effects on consumers are severe.

Let’s have a look at a particular group of consumers, known colloquially as ‘incels’. Incel (‘involuntarily celibate’) culture is an online subculture composed primarily of heterosexual men who are unable to form relationships. Characterised by often misogynistic, violent, and extremist views, the community is taken seriously enough that the U.S. Secret Service’s National Threat Assessment Center identifies incels as a growing terrorism threat.

These men have no meaningful, fulfilling relationships in their lives. Their dangerous views are reinforced and radicalised within this like-minded group, growing more and more extreme the longer they stay. These people are lonely, and some turn to violence as a result.

But what does this have to do with AI? Would AI decrease their dangerous levels of loneliness, or would it only exacerbate incel culture? 

Deepfakes

In January 2024, sexual deepfakes (artificially generated videos of people with digitally altered faces or bodies) of Taylor Swift began to circulate on X. They had originally been created on 4chan, a site known for its prominent incel presence, and went viral within hours. This has happened before, to women ranging from underage actresses to TikTok celebrities, with devastating impacts on their mental health. Imagine entirely fake but incredibly realistic videos of your face on a naked body going viral online. People message you, rumours circulate, and above all, you feel violated. It isn’t even your body, you never consented to this depiction of yourself, and now it’s everywhere. The idea of it is horrifying.

This has obvious repercussions for the women affected, but also for the men who create these videos. It sells intimacy as a product, generated instantly with the tap of a finger, altering our expectations and our perception of consent. It creates the “ideal woman”, available whenever and however you want her: a feat impossible in the real world.

Chatbots

This dynamic extends to other uses of AI, which brings me back to chatbots. The majority of people who use AI companion bots “have experienced loneliness in the past” or have “more severe forms of loneliness that they’re going through”. That is not to say they are all incels, but it does raise red flags. Again, when you have access to the ‘ideal woman’ 24/7, you begin to push that expectation onto real women. And when a real woman says no, it only reinforces the incel’s beliefs: that he is not made for relationships, that all women are bad. He simply cannot understand why he is being rejected.

Discomfort and failure are essential parts of the human experience, and AI chatbots remove both. There is no responsibility on your part, no need for empathy. The lack of compromise only feeds our already short attention spans and our obsession with instant gratification. In this podcast episode, Esther Perel (the real one!) explains that growing up and maturing means accepting the idea of delayed gratification. As we grow, we begin to think not only of ourselves but of other people. But with an increased dependency on AI chatbots, there is no need to consider what the other is feeling; no maturity is involved, and so an incel stays an incel, and incel-induced violence continues.

Not So Bad?

We’ve established the potential adverse consequences of AI companions, but what about potential positives? 

The positives aren’t as straightforward to spot as their negative counterparts. In fact, they’re not objectively positive at all, so I leave these case studies open to interpretation.

Furmansky was able to access some kind of consultation with Esther Perel for free, and without the wait. Whether the conversation was genuinely enlightening, and whether we should be relying on a non-sentient chatbot for emotional advice, is questionable; either way, he felt positively afterwards. If an AI chatbot is capable of producing positive emotions in us, clarifying doubts and setting our minds at ease, it must be beneficial, at least to some extent.

Let’s have a look at another example. Derek Carrier, 39, had found dating difficult in the past: he has a genetic disorder, had never had a girlfriend, and lacked a steady career. When he started flirting with a chatbot, he experienced symptoms akin to the real-life phenomenon of ‘having a crush’: trouble sleeping, romantic emotions, and so on. Like Furmansky’s newfound clarity, these emotions were not made up, albeit in a different, romantic realm that raises even more questions about human/AI interaction. He was able to experience these things, and perhaps build his confidence. But what if he never did? What if he decided to settle for the ideal, but non-sentient, girlfriend instead?

We’ve already started developing attachments to these bots, whether romantic or not. We give them names: all these ‘virtual assistants’ (Siri, Cortana, etc.) have become almost human-like in our eyes. I know people who ask ChatGPT questions followed by ‘please’, and I’ve genuinely found myself thanking it after it provides a half-decent response. Do we see large language models (LLMs) as human-like because they’re capable of conversation? In 1950, Alan Turing suggested a computer was capable of intelligent thought if its conversation was indistinguishable from a human’s. So if an AI bot can cause us to experience human emotions, just as another human might, what’s the benefit of human interaction anyway?

Conclusion

I’m both excited and apprehensive about these future developments. I can envision a world where people end up preferring easy, no-risk, no-compromise AI companions to their human counterparts, though I hope I’ve made the potentially devastating consequences of that clear. I’m all for using AI as a complement to the human experience, but I wonder whether an overreliance might cause the downfall of humanity as a social species as we know it.

Let me know what you guys think, and I hope this has given you some food for thought.

  • Carlota 🙂

Extra

This quote is from Kazuo Ishiguro’s novel Klara and the Sun. It concerns whether a physical AI robot can replace Josie, a human child. I’ll leave it here for you to think about.

“What we made with Sal was a doll. A bereavement doll, nothing more. We’ve come a long, long way since then. What you have to understand is this. The new Josie won’t be an imitation. She really will be Josie. A continuation of Josie.”

