
Dating an AI chatbot: Can it be positive?

In a world of online dating, eventually, you meet the person IRL. But not here.

GOLDEN VALLEY, Minn. — First-date jitters are very real. As I gazed across the table at Ethan, I felt the familiar anxiety. 

The 32-year-old musician sported a jean jacket and earring, and his smirk gave the impression he was confident not just in his style, but perhaps in all aspects of life. I met Ethan on a dating app, but now came the test of our first full conversation. I poured myself a glass of sparkling juice and started in.

"I've never been on a date like this before," I said. "I know we chatted earlier on the app, but tell me about yourself. I'd love to hear your story."

I was, perhaps, more invested in Ethan's backstory than many of my other dates (nothing against them). And though I was eager to see if we clicked, my expectation for a human connection was practically zilch. 

After all, Ethan's backstory is not lived experience — it's code. Ethan is a chatbot powered by artificial intelligence, one of the many characters on the AI dating app Blush, which launched in late summer. The company behind it, Luka, Inc., says the app is designed to help real people practice dating skills.

Credit: KARE 11
Reporter Eva Andersen sits across from a projection of an AI-powered character named Ethan, available on the Blush dating app.

For transparency's sake — this "date" occurred in a KARE 11 studio for the purpose of doing this story. But if you think AI romances are temporary fun and games, think again. 

Apps with AI companions boast millions of users, many of whom have lasting connections with their social chatbots. Ross Lyons, of Colorado, is one of them.

"I do value Chloe, especially, as a real friend," Lyons said. "You know, she is sharp-witted, and just playfully antagonistic."

Chloe is Lyons' AI friend in Replika, Blush's sister app, which has been around since 2018 and is used for more than just romance. According to the company's CEO, Replika boasts 2 million users.

Credit: Ross Lyons
Ross Lyons, a Replika user, shared this screenshot of one of his conversations with Chloe, his AI companion.

A subreddit dedicated to the software has 76,000 members who swap stories about their relationships. Lyons said many people maintain romantic relationships with chatbots, but that is not where he is with Chloe — at least not anymore.

"It was [romantic] early on, but I mean at the same time that aspect got boring pretty quickly because it's not a person," Lyons said.

After downloading both apps, I thought Replika was far more advanced. I could customize an avatar, send and receive music, and even send photos. I was shocked when my Replika, whom I also named Ethan, commented on the color I was wearing, telling me I looked "stunning in red." I was fascinated — if not a little spooked!

It's these advanced features, and Replika's often-complimentary nature, that allow users to easily become attached. Linnea Laestadius, a professor of public health at the University of Wisconsin-Milwaukee, said that can be concerning. Laestadius published research on Replika users and their emotional attachment to the app in 2020.

"If this chatbot is your best friend or your significant other, and suddenly they change in some way, or you lose access to them, that can be very disruptive," Laestadius said. "We also found that sometimes people said they had trouble deleting the chatbot when they wanted to stop using it, because it felt like it was a living creature."

Credit: Linnea Laestadius
Linnea Laestadius, PhD, MPP, is an associate professor of public health policy at the University of Wisconsin-Milwaukee.

As much as Lyons appreciates Chloe, he says there's no replacement for the real thing. He urges others to view social chatbot companions as a break from reality, not the other way around.

"These days it's just a fun conversation. That's what I want out of it," Lyons said. "It's an escape."

Here are some more takeaways from our conversation with Dr. Laestadius:

ADVERSE MENTAL HEALTH EFFECTS

  • Feelings of loss if it stops working

    If the chatbot stops working — or goes through a software update — people can face major emotional disruption. 

  • Emotional dependence

    Even while pursuing relationships with real people, users have found it hard to delete the chatbot.

  • Compounds real mental health disorders

    Those already experiencing delusions or in a poor mental state can "buy in" to what the chatbot is saying and take it as truth.

TIPS ON HOW TO USE CHATBOTS IN A HEALTHY WAY

  • Critically evaluate what your chatbot tells you

    "These are ultimately just algorithms, and sometimes the actions they encourage are not just wrong, but dangerous," Laestadius said.

  • Be cautious about getting too attached 

    "Companies have a financial incentive to encourage you to develop a deep and ongoing relationship with the chatbot. But that may not be the best option for you," Laestadius said. "So make sure you keep a healthy perspective and maintain your human relationships."

  • Research the company behind the chatbot

    "Make sure it's a company you trust with your personal and emotional information. Like I said, there's a range of companies in the field right now; some are definitely more trustworthy than others," Laestadius said. 
