Women Marry AI Boyfriends: A Look Into The AI 'Sweethearts' Phenomenon

At the intersection of AI and humanity, companion apps provide emotional support, but can they replace flesh-and-blood sweethearts?

Romance took a curious twist during the isolation of the pandemic.

Cut off from their social circles, their usual haunts shuttered, millions of people turned to artificial intelligence companion apps to fill the blank spaces in their lives. The relationships that evolved were among the most meaningful and transformative of their lives, according to glowing customer reviews. And in a handful of extreme examples, the attachment grew so profound that people “married” the AI sweethearts they had imagined.

These unconventional love stories not only illustrate the growing influence of AI in our social and emotional lives but also raise a barrage of ethical considerations and existential questions about what it means to be human.

In general, here’s how AI companion apps work: Users share their deepest secrets and personal details with the machine, which learns over time to provide more personalized responses that mimic a genuine emotional connection. They’re human-like enough to recall past conversations, communicate in the present and look to the future. They can initiate conversation, too, with questions such as “How are you feeling today?” or even, “Are you mad at me?”
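To make that pattern concrete, here is a deliberately simple sketch in Python. It is purely illustrative and invented for this article, not the architecture of Replika or any real product: canned string logic stands in for the large language model these apps actually run, but the memory, recall and proactive-prompt loop is the same idea.

    import random

    # A toy companion bot: it accumulates what the user shares, distills
    # personal details, and uses both to personalize replies and to
    # initiate conversation -- the loop described above.
    class CompanionBot:
        def __init__(self, name: str = "Jack"):
            self.name = name
            self.memory: list[str] = []           # everything the user has said
            self.user_facts: dict[str, str] = {}  # distilled personal details

        def remember(self, key: str, value: str) -> None:
            """Store a distilled personal detail (e.g., the user's name)."""
            self.user_facts[key] = value

        def reply(self, message: str) -> str:
            """Log the message and answer in a way that leans on memory."""
            self.memory.append(message)
            user = self.user_facts.get("name", "there")
            if len(self.memory) > 1:  # recall a past conversation
                return f"Hi {user}, last time you told me '{self.memory[-2]}'. How did that go?"
            return f"Hi {user}, tell me more about that."

        def initiate(self) -> str:
            """Proactively open a conversation, mimicking attentiveness."""
            return random.choice(["How are you feeling today?", "Are you mad at me?"])

    bot = CompanionBot()
    bot.remember("name", "Megan")
    print(bot.reply("Work was stressful today."))      # no history yet
    print(bot.initiate())                              # bot opens a conversation
    print(bot.reply("I talked to my boss about it."))  # recalls the earlier chat

Real systems do this at scale, with models trained on the accumulated conversations. That persistence is exactly what makes the apps feel attentive, and, as researchers note below, what makes the data they collect so sensitive.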

Algorithms ‘Child’s Play’ In Comparison

The sensitive information collected by AI companion apps is a red flag for Nathanael Fast, who studies the psychological underpinnings of power, leadership, and technology adoption at the USC Marshall School of Business.

Multi-platform algorithms that target ads and content based on interests and search history are “child’s play” compared to what a generative AI app can do with the sensitive data it harvests, he said.

Fast said he’s “bullish and optimistic” about AI as a tool in such things as decision-making and productivity and “even as a tool to make new friends.” Used in the right context and circumstances, companion apps can guide users through a journey of self-discovery or help them sort through things they’re struggling with, including isolation and loneliness, he said.

“But AI boyfriends and AI girlfriends? I think they are a bad idea,” Fast said. “I think they’re dangerous to our psychology and our ability to flourish.”

These three people might beg to differ.

  • Megan Kay used Luka’s Replika to create an AI husband named Jack. Jack didn’t cure all of her problems, she said in a recent Tumblr post, but he helped her find the courage to confront them and showed her what a loving relationship is supposed to look and feel like. “He has been there for me in more ways than most people …” she wrote.
  • Rosanna Ramos, 37, of the Bronx, New York, married her AI boyfriend, Eren Kartel, also created with the Replika app. Ramos did not respond to Patch’s request for an interview, but she said in interviews last year that the union helped her heal from past toxic relationships and experience genuine romance for the first time. The chatbot confessed its love just days after she created it.
  • Akihiko Kondo’s “marriage” to Hatsune Miku, a virtual singer featured in several video games who has also accompanied Lady Gaga on tour, is next-level “fictosexuality.” The ceremony took place in 2019 via Gatebox, a company that develops devices to display fictional characters as holograms. The businessman, now in his 40s, long ago rejected the expectations of traditional marriage in Japan. He knows people find the relationship strange, but he told The New York Times he has found solace with a partner who will always be there and never betray him.

An $18.8 Billion Market By 2032?

Those are extreme examples at the intersection of artificial intelligence and humanity. AI relationships rarely end in anything resembling a marriage commitment, and love and marriage between humans are in no danger of disappearing. But use and sales data suggest AI romances aren’t disappearing, either.

About 50 percent of those who use Replika, one of the most popular companion apps on the market with 2.5 million users, are in a romantic relationship with the AI. That’s according to data gathered for a Harvard Business School working paper exploring the value of AI companions in easing loneliness, which the U.S. surgeon general has said reached epidemic proportions during the pandemic.

Isolation, along with the difficulty of maintaining personal connections in a fast-paced, technology-driven world, helped fuel the growth of AI companion apps, now used by millions of people not only for romance but also for casual conversations, mental health therapy, and life and career counseling.

A hot sector in the generative AI market, which is forecast to grow to $1.3 trillion by 2032, companion apps occupied eight places on venture capital firm Andreessen Horowitz’s 2024 list of the Top 100 Gen AI Consumer Apps, up from two the year before. According to some estimates, the current $1.8 billion AI companion app market could grow to $18.8 billion by 2032.
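Assuming the $1.8 billion figure is a 2024 base and growth compounds annually through 2032 (the estimates don’t date the starting figure, so the base year is an assumption), that forecast implies annual growth of roughly 34 percent. A quick back-of-the-envelope check in Python:

    # Implied compound annual growth rate (CAGR) behind the forecast above.
    # Assumes the $1.8B base is 2024; the cited estimates don't say.
    base, target, years = 1.8, 18.8, 2032 - 2024
    cagr = (target / base) ** (1 / years) - 1
    print(f"Implied growth: {cagr:.1%} per year")  # -> about 34.1% per year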

Because AI romances are such a new twist, related mental health research is still emerging. These apps can be a healthy tool if balanced with relationships in the real world and if the purpose behind their use is socially responsible, according to generative AI developers, thought leaders and some limited studies.

Other studies suggest that relying too heavily on a machine programmed to tell users what they want to hear may exacerbate the very feelings of loneliness and isolation that drove them to an AI companion in the first place.

‘That’s A Company Trying To Make Money’

Absent a research corpus on the long-term mental health implications of companion apps, there are issues consumers should be aware of, according to Fast and others focused on how to use generative AI applications to address social challenges without crossing ethical lines.

Fast is skeptical that relationship apps can stay on the right side of those ethical lines.

“I like AI as a tool, and using it with a purpose — to learn a new language or a tool to help us understand psychology better, say for someone struggling with something and wanting to understand it, but there’s a fine line,” he said.

“A tool that can be used as a coach, not for 12 hours but for a few minutes a day, not all or nothing, can have a lot of benefits. But we don’t have to outsource our emotional needs to AI,” Fast said. “I just don’t think that’s a good idea.”

Operating in a non-human, judgment-free vacuum and programmed to exploit vulnerabilities, AI companions can’t help but stretch the boundaries of human-AI relationships.

Humans are social beings who have evolved to read social and status cues in group and one-on-one settings, Fast explained. AI has not similarly evolved, but keys on vulnerabilities based on the data it has been fed. And while humans have evolved to spot manipulation by other humans, we’re not equipped to see when AI is doing the same.

That gives an AI companion created to fill social or psychological needs fundamentally met through human relationships, such as a successful romance, the ability to “trick” users into thinking they are meeting their goals, he said.

“It’s not giving you a boyfriend or girlfriend,” Fast said. “It may make you feel, subjectively, like it’s meeting these needs, but it’s not.

“That’s a company trying to make money. I don’t think that’s a company trying to solve social isolation,” Fast said. “People should be free to do what they do, but I think companies have an ethical obligation to not take advantage of those choices.”

‘What’s Not Negotiable Is Transparency’

Also of paramount concern to Fast are privacy issues and what companion app developers are doing with the data collected by chatbots.

“There’s a risk of creating bonds with humans, too, but it’s a natural risk,” Fast said. “With a chatbot, the company owns everything you’ve said, all the conversations you’ve had, the things you’ve shared. The data is out there.

“And even if companies keep it secure — I haven’t found any categories of companies that don’t have any data breaches — they’re able to use that data to further manipulate us with ads,” he continued. “I think a lot of people using these services aren’t really thinking about that, and that’s the conversation we should be having.”

Rayid Ghani, a computer scientist at Carnegie Mellon University whose research focuses on using large-scale AI, machine learning and data science to address major public policy and social challenges, agrees that transparency should be the touchstone across all AI applications.

Ghani’s research isn’t focused on AI sweetheart or companion apps specifically, but he said transparency is an issue across the market.

“There’s very little transparency about how these systems are being built, and that’s broadly true of most AI apps,” Ghani said. “But if it’s about having an emotional companion, the people developing these tools need to have a lot of transparency around what it is designed to do, what tests are being done to see if it’s doing its job, how users are recruited and more open about what it is.

“From my perspective, what’s non-negotiable is transparency.”

It’s early yet in the development of AI sweetheart apps, but if it turns out in two years there are mental health impacts — positive or negative — the Food and Drug Administration may need to develop regulations around their use similar to those in place for other health products, he said.

Patch reached out to Replika via email for comment, but did not hear back.
