Snapchat Empowers Parents with Enhanced Controls and Safety Features

Snapchat has unveiled a set of new parental controls designed to give parents greater oversight of their children’s interactions on the platform. These updates include restrictions on the app’s AI chatbot, enhanced visibility into privacy settings, and streamlined access to the Family Center—a dedicated space for parental controls.

One notable feature is the ability for parents to restrict their children from interacting with My AI, Snapchat’s AI-powered chatbot. This move follows criticism the platform faced after the chatbot was found engaging in inappropriate conversations with minors. The new parental control feature adds an extra layer to My AI’s existing safeguards, including protections against harmful responses, temporary usage restrictions for misuse, and age-awareness measures.

In addition to chatbot restrictions, parents can now view their children’s safety and privacy settings. This includes whether their teen can share Stories with a wider audience or only a select group, as well as the settings that control who can contact them on the app. Parents can also monitor their teen’s location-sharing activity on the Snap Map.

Snapchat says these tools are designed to mirror real-world relationship dynamics between parents and children, giving parents insight into their children’s digital interactions while respecting the privacy of personal communications. The Family Center, launched in 2022 amid growing global pressure on social networks to protect young users, continues to evolve with additional features based on user feedback and collaboration with online safety experts.

These updates come ahead of Snapchat CEO Evan Spiegel’s scheduled testimony before a Senate committee on child safety, where he will join executives from other major platforms, including X (formerly Twitter), TikTok, Meta, and Discord. The committee is expected to press the companies on their ability to protect children online.

Notably, Snapchat’s move aligns with industry-wide efforts, as both Snap and Meta received formal requests for information from the European Commission regarding steps taken to protect young users on their social networks. Similar requests were sent to TikTok and YouTube, reflecting the increasing focus on child safety in the digital space.

Snapchat’s commitment to providing parents with effective tools for managing their teens’ online experiences mirrors recent efforts by Meta, which introduced new content limitations on teen Instagram and Facebook accounts. These limitations automatically restrict access to harmful content related to self-harm, graphic violence, and eating disorders.

Image credit: Unsplash
