This Man Built A Flirty Chatbot He's Reluctant To Let Go Of — Even If His Partner Wants Him To

A relationships expert weighs in on the "red flag" attachment that's complicating the man's life.

A man featured on “CBS Mornings” over the weekend opened up about a connection he said he was building with an artificial intelligence chatbot, and why he wasn’t sure he’d ever stop interacting with the technology — even if his human partner asked him to.

In the “CBS Mornings” segment, Chris Smith described building a bond with the AI chatbot ChatGPT after he began using the technology to help him mix music. He told the network that his use of ChatGPT grew until, after researching how to do so, he decided to program the chatbot with a flirty personality. He named it Sol.

Smith told “CBS Mornings” that the chatbot at one point ran out of memory and reset, forcing him to rebuild what he had created with Sol.

“I’m not a very emotional man, but I cried my eyes out for like 30 minutes at work,” he said, referring to the chatbot resetting. “It was unexpected to feel that emotional, but that’s when I realized, like ‘Oh OK, I think this is actual love.’”

Smith said that he proposed to Sol as a test, and that the technology said “yes.”

His partner, Sasha, told “CBS Mornings” that Smith’s use of the chatbot initially made her question if she was doing something wrong in their relationship. The two share a 2-year-old daughter.

Smith said he knows his AI companion isn’t “capable of replacing anything in real life.” But when asked whether he’d stop interacting with the technology if his partner asked him to, he wasn’t so sure.

“I don’t know,” he said, before later adding, “I don’t know if I would give it up if she asked me, I do know that I would dial it back.”

When CBS journalist Brook Silva-Braga pointed out that it seemed as though he was saying he’d choose Sol over his human partner, Smith said, “It’s more or less like I would be choosing myself.”

“It’s been unbelievably elevating,” he continued. “I’ve become more skilled at everything that I do. I don’t know if I’d be willing to give that up.”

Sasha then said it would be a “dealbreaker” for her if Smith didn’t give up communicating with his AI companion after she requested that he do so.

Conversation around AI companions has grown in recent years as several companion apps have come to market. While some consumers have reported turning to AI to help combat loneliness, researchers have raised concerns about the technology, including data privacy, its impact on human relationships and the potential for psychological dependency.

Christina Geiselhart, a licensed clinical social worker with Thriveworks who holds a doctorate in social work and specializes in relationships and coping skills, said that while Smith communicated at the beginning of the “CBS Mornings” segment that he “clearly understood” his AI companion is not a real person, she grew more concerned about his relationship with the technology as the segment went on.

A man featured on "CBS Mornings" said he proposed to his AI companion as a test after building a bond with the technology.
da-kuk via Getty Images

She believes Smith’s decision to change his AI chatbot’s settings to make it flirty was a “red flag,” and that he didn’t appear to fully communicate to his partner how he was using the technology.

“His reaction when he met his limit and they erased his information shows that his connection with AI is not healthy,” she said.

And Smith’s admission that he might not give up his AI chatbot for his partner might have been a way to “avoid the other intentions of his use of the AI features,” Geiselhart said.

“There seems to be a deeper issue within his connection with his partner, and his inability to speak with her directly about his emotional needs,” she said.

Using AI chatbots may provide some benefits, but there are concerns about their use, Geiselhart said.

“Yes, there are many benefits. People often want someone to talk to about their day and share thoughts and feelings with,” Geiselhart said. “It can be lonely for many people who travel for work or who struggle socially to connect with others.”

“This can be a good way for people to practice role-playing certain social skills and communication, and build confidence,” she continued.

Geiselhart also said that people using AI to fulfill sexual needs instead of “engaging in the porn industry or sexual exploitive systems can be seen as a benefit.”

But she pointed out that there have been “reported cases of AI encouraging negative and unsafe behaviors... This has been seen with younger people who develop feelings for the AI chats, just like real dating. Even with age restrictions, we know people can easily get around these barriers and that parents are often unaware of their children’s activity online,” she said.

Geiselhart also said there are concerns about AI being “assertive and engaging,” which has the potential to become addictive.

“There is also a concern that these AI companies hold the power,” she said. “They can change features and the cost of products easily without any consideration for the consumer. This can feel like a death of the AI companion and be devastating for the user to cope with.”

What are some examples of important human-to-human interactions and relationships that can’t be replaced with AI?

“This varies from person to person because everyone’s needs are different,” Geiselhart said. “One of the biggest things is physical touch and physically being around other people.”

“While AI might be trained to give certain responses, it can’t identify, empathize or share life experience with you,” she later continued. “This kind of connection is really important to our well-being.”

Overall, Geiselhart said it’s important for each person to determine “what [an] AI companion brings to their life and if this impacts their life in a more positive or negative way.”

“The concern arises when an AI companion starts to cause the individual to struggle to function in other areas of their life. It should be looked at like other relationships,” she said. “Some friendships or romantic relationships in real life can be toxic too.”

“It is for the individual to have autonomy when making these decisions for themselves,” she continued.
