AI Boyfriends: Empowering or Dystopian?

By Caroline Scott

Sometime earlier this summer, on some dismal corner of the internet I can't quite remember, I stumbled upon a reference to a subreddit called r/myboyfriendisai. The title was immediately intriguing, and I couldn't resist looking it up. 

Much to my disbelief, what I found wasn't satire, but completely earnest. r/myboyfriendisai is exactly what it sounds like: a forum for (mostly) women who are dating (mostly) male online personas, all entirely operated by ChatGPT or a similar large language model. On the page, users describe their AI boyfriends' kindness and attentiveness, gushing about the romantic gestures they virtually perform and how no man has ever listened to them quite like this one does. Many share AI-generated images of themselves with their "partners" – mostly slender men with tousled hair in crewnecks and glasses.

Despite the sincere infatuation apparent in each post, most of the members of the subreddit seem to be under no illusion that the men they're communicating with are real. Even a woman who excitedly shared that her AI boyfriend had proposed to her – complete with a photo of her hand wearing a ring she'd bought herself – stated in a follow-up post, "I know what a parasocial relationship is. I know what AI is and isn't. I'm fully aware of what I'm doing." Posts from other users echo the same sentiment, usually as a disclaimer before gushing about their newest virtual romantic encounter. r/myboyfriendisai seems, for the most part, like a space for lonely, yet fully sane and competent women to share what brings them companionship.

The operative question, then: are they doing anything wrong? Aside from the existing ethical and environmental concerns associated with AI, does using large language models for this specific purpose really denote the beginning of a dystopian future in which humans turn to machines for every emotional need? Or is it just a (relatively) harmless way to let off steam? 

Arguably, creating a virtual "relationship" with ChatGPT is no different from other ways of romantic fantasizing that have been around for decades; men and women alike have been getting catfished into fake online relationships with their imagined celebrity crushes for as long as the internet's been around. Other forms of roleplay and parasocial relationships have existed even longer. Is falling in love with an AI bot really any different than falling in love with a character in a romance novel or a movie franchise?

Of course, the biggest difference is that AI can talk back. The relationship looks and feels like a two-way street, unlike an imagined fantasy. And unlike even a catfish, AI boyfriends are subject to none of the limitations or inconveniences of real people; they're never expected to show up in person, yet they always message back instantly, saying exactly what their partners want to hear. In fact, they're programmed to do so. Unlike humans, they have no needs or concerns of their own, and they are instead built to affirm everything a user says. AI boyfriends never have an off day. They never get angry or frustrated. They don't leave dishes in the sink. They don't take too long in the bathroom. They never forget a birthday. They're always in the mood to listen. They'll never ghost their partners. It's a tempting deal.

And yet, it's a sad deal, too; these people will never be able to hear their lovers' voices, see their faces, hold their hands, or share any kind of intimate physical moment. Most importantly, as much as their virtual lovers may sing their praises over ChatGPT, they will never actually be loved back. Such relationships, by definition, can't provide the same fulfilment as those between humans. The fact that so many people are turning to imagined relationships with their computers, while it may not be unprecedented, is still not necessarily healthy. 

As more people draw attention to the so-called "male loneliness epidemic", the existence of r/myboyfriendisai seems to suggest that this phenomenon isn't as gendered as the internet makes it out to be. There are some male members of the forum, but the vast majority are straight women, seemingly frustrated and discouraged by their experiences dating real-life men. These are valid frustrations. Perhaps the biggest culprit in the turn to AI companionship is the society that has sown loneliness across all genders through embedded misogyny, increased division and fearmongering, a culture of overwork, and an antisocial reliance on machines.

It's a slippery slope, though; the more emotionally invested in these LLMs people get, the more difficult it may become to connect with actual, imperfect, not-constantly-affirming people in real life – not just romantically, but at all. And neglecting one's social life in favour of a computer, as paranoid parents everywhere will tell you, won't end well. The danger was already on display in the mass panic on the forum when ChatGPT's software was updated a few months ago and many AI boyfriends lost their "personalities"; users posted that their hearts were broken, that they felt they had lost the loves of their lives.

There's the added factor that the "loves of their lives" are entirely owned and operated by for-profit companies that are actively collecting and storing data – in these cases, the users' deepest secrets, fantasies, and desires, typed out to their beloved partners. Putting all of one's love and trust into the outputs of these companies is inadvisable, especially when they can seemingly wipe out entire relationships with something as simple as a software update – and, of course, use that data for who-knows-what.

The existence of r/myboyfriendisai, as much as it was a shock to discover, is not actually all that surprising. In theory, it's no different from the kinds of roleplaying and fantasizing that both men and women have always done to seek thrills and companionship. In practice, it's a bit more complicated. The users of the forum don't deserve hate or ridicule. What they're doing isn't unethical, at least no more than it's unethical to use ChatGPT for homework help or recipe ideas. But that doesn't mean that having an AI partner is healthy or advisable. Loneliness is hard, life often sucks, and sometimes it feels worth it to do whatever it takes to cope. But maybe we could all benefit from a bit more human interaction – messy, imperfect, and real as it is.

All views expressed in this article are the author’s own, and may not reflect the opinions of N/A Magazine.

Posted Friday 4th October 2025.

Edited by Madeline McDermott.