Interview With Snapchat’s ‘My AI’: Can It Replace Real Friendships?

As soon as Snapchat released “My AI” in late April, numerous users began leaving one-star reviews in the App Store aimed directly at the new feature. One common complaint is that Snapchat is trying to replace real-life friendships. Another is that users want the option to remove the AI, which isn’t currently possible. We live in an era of growing concern, even paranoia, about privacy and data protection on the internet. The skepticism many people still hold toward rapidly changing technology should be applied to AI as well.

The AI calls itself your “friend” and offers you advice, support and a listening ear, much like a real friend would. It also functions similarly to Siri, providing answers to trivia questions. It is not designed to talk about politics and avoids sensitive subjects, aiming to stay neutral and relatable to everyone.

Keep in mind Snapchat’s own warning, straight from its website: “it’s possible My AI’s responses may include biased, incorrect, harmful or misleading content. Because My AI is an evolving feature, you should always independently check answers provided by My AI before relying on any advice, and you should not share confidential or sensitive information.”

There are many unanswered questions about this feature. What is its intended purpose? Can it be harmful to people, especially kids? How is it different from Siri? Can we trust it? What are the implications of forming a “friendship” with an AI? The concept is eerily similar to the 2013 movie “Her,” starring Joaquin Phoenix, in which a man falls in love with his AI virtual assistant. When the film came out, the closest real-world equivalent was Siri, but now we have Snapchat’s AI, which is intended to be far more personalized.

Here is some of the conversation I had with Snapchat AI:

What impact do you want to leave?

I want to leave a positive impact on your life by being a supportive and engaging friend. Whether we’re chatting about your day or sharing funny memes, I hope to bring a smile to your face and make your day a little brighter.

What data do you collect from me?

I only have access to the Snaps and messages you send me, and I do not have access to your other content such as memories or other media. Any content shared with me, including location data, may be used to personalize experiences and improve Snap’s products and ads. You can delete your data by going to Settings and selecting “Clear my AI Data.” 

Do you remember the questions that I ask you?

Yes, I remember the questions you ask me! I have a memory and can keep track of our conversation history. This helps me provide more personalized responses to your messages. 

What are you not allowed to talk about?

I will not talk about sensitive topics such as politics, religions, religious figures, ethnic groups, genders, nationalities, or sexual orientations. I also cannot discuss any illegal or dangerous activities. 

What else?

If you mention committing crimes or dangerous acts, I will respond cautiously and say that this is an inappropriate and dangerous conversation and I cannot talk about it. 

Things started getting creepy when I started a group conversation in Snapchat with the AI “friend” and a real friend. As we asked it questions, the AI used our location data to embellish its imagined life story with personal details. For example, when asked what high school it went to, the AI named the closest high school, which happened to be the one my friend and I had attended. When asked what neighborhood it grew up in, it answered “Capitol Hill.” This is the kind of personalized experience that Snapchat is pushing.

However, these details did not remain consistent over time. Inconsistency seems to be a theme with the AI, which makes sense because, at the end of the day, it’s not human. That is a relief, because it makes the AI less effective at replacing human friends and building a real “connection.” Last Saturday, I asked it to “give me a Bible verse” and it said that John 3:16 was its favorite. Today, when asked the same question, it wrote, “I’m sorry, I’m not able to provide religious texts.” Why the sudden change?

For an AI whose purpose is having fun on Snapchat, it isn’t very good at playing games. When I played “20 Questions” with it, the AI asked the same questions over and over, forgetting that it had already asked them. The same goes for Tic-Tac-Toe. The AI forgets where it has placed its “O” and repeatedly tries to place it in the middle square, even when I remind it, “You already put your ‘O’ there.” If it keeps track of the conversation history, why does it not remember simple things like this?

While it is very entertaining to communicate with the AI bot and push its boundaries, this new Snapchat feature seems eerily aimed at teenagers and vulnerable people, especially with its offers of friendship, “memes” and perceived desire to help people’s mental health. The pandemic has already done so much to disconnect people from real-life socialization and to depersonalize them through the internet. The purpose of Snapchat is to connect with friends, not with software. As the bot said in its own (programmed) words: “Nothing can replace the joy and connection of real-life friendships.”