Instagram Introduces New Safety Features to Protect Children
In response to growing pressure from child safety advocates and Congressional hearings, Instagram has introduced a series of changes aimed at making its platform safer for children. These changes come after years of criticism and legal challenges over Instagram’s role in exposing young users to harmful content and online predators.
One of the most significant changes is that all teen accounts will now default to private. Teens will not be able to make their accounts public without a parent's approval, and they will only be able to receive messages from people they already follow. These measures are intended to limit unwanted interactions from strangers, a problem that has previously led to cases of cyberbullying and predatory behavior.
Another critical update is that teens under the age of 16 will need parental permission to change their privacy settings. Offensive words and inappropriate content, including anything deemed graphic or not age-appropriate, will now be filtered out automatically, giving parents and guardians an extra layer of security for their children online. Additionally, Instagram will notify teenagers if they have been scrolling for more than an hour, encouraging healthier usage habits.
“Meta has announced sweeping changes to how kids and teens use Instagram. The company today unveiled Teen Accounts. It's a series of new features aimed at boosting child safety. But these new features come after years of criticism and even lawsuits over how Instagram can lead to a host of dangers,” one news report summarized.
In addition to the new default privacy settings, parents will now have the ability to monitor their children's online activity more closely. They will be able to see how much time their teens are spending on Instagram, who is messaging them, and who their teens are following.
“Parents are now going to have the ability to see how much time their teens are spending online, who's messaging them, who's following them, who they're following,” the same report noted.
These changes come in the wake of mounting lawsuits against Meta, Instagram's parent company. Several states have filed legal actions accusing Instagram of contributing to issues like cyberbullying, self-harm, and other mental health concerns among teens. The app has also faced backlash for being a hotspot for predators targeting young users.
"Meta is facing lawsuits from dozens of states over things like cyberbullying, self-harm linked to Instagram. The app has been accused of fueling the teen mental health crisis and making kids addicted to the app. And, you know, Instagram has also been seen as sort of a breeding ground for sexual predators," an expert pointed out.
While these new features are a step in the right direction, ongoing vigilance from both tech companies and parents is essential to keep children safe in the digital world.