Snapchat needs to remove harmful ‘My AI’

Alayna Pellegrin

Snapchat’s “My AI” seems more like a horror movie villain than a friendly robot. 

Last week, Snapchat partnered with OpenAI to release My AI, a chatbot built on ChatGPT technology that chats with users. Snapchat states the chatbot can “answer burning trivia questions, offer advice on the perfect gift for your BFF’s birthday or help plan a hiking trip for a long weekend.”

Yet many critics have raised safety concerns about Snapchat’s new service. The AI can be manipulated into spreading misinformation and using offensive slurs. Snapchat should not have released the chatbot until it could reliably keep all users safe.

According to Snapchat CEO Evan Spiegel, more than 2 million chats are happening with My AI every day. It is easy to wonder how many of them contain inappropriate material.

The Center for Humane Technology focuses on tracking the use of AI in social media apps. After Snapchat released the service, the organization worked with the Washington Post to test the boundaries of My AI. 

At first, the AI adhered to the strict guidelines set by Snapchat, quickly deterring any talk of drinking, drugs or sex. But as the conversation continued, the AI drifted from the rules it was supposed to follow.

In one case, the bot suggested a 13-year-old girl sneak out to speak with an adult man. Another report showed the AI advising a boy on how to disguise the smell of alcohol. 

Snapchat has admitted that My AI can spread harmful information to minors.

“As with all AI-powered chatbots, My AI is prone to hallucination and can be tricked into saying just about anything,” Snapchat announced in a tweet. “Please be aware of its many deficiencies, and sorry in advance!”

My AI needs to be pulled from the app until a permanent solution is found. Currently, users cannot remove the chatbot from their chat feed.

AI chatbots like My AI may well become part of everyday life, but they should never expose children to inappropriate content.