Meta, the company behind several leading social media platforms, is making Instagram safer for teenagers by adding two major features to the direct messaging (DM) section. Now, when a teenager is about to chat with someone, even if both follow each other, Instagram will show safety tips. These reminders prompt the user to check the other account's profile carefully and not to share anything if something feels wrong.
Furthermore, Instagram will now display the month and year when the other account was created at the top of the chat. This change is meant to help teens better identify suspicious or potentially fake accounts, especially those used by scammers.
Quick Block and Report option introduced
Meta has also improved the way teenagers can protect themselves from unwanted interactions. If a teen wants to block someone in a chat, Instagram now offers a combined "Block and Report" option, so both actions can be completed in a single step. Earlier, blocking and reporting had to be done separately. This small update makes it easier for teens to end and report bad experiences at the same time.
Adult-managed child accounts get stricter controls
Meta is extending teen safety protections to adult-managed accounts that feature children. These are typically run by parents or talent managers on behalf of children under 13. Such accounts will now be placed under Instagram's strictest safety settings by default, including:
- Tighter message controls
- Hidden Words filter to block offensive comments
- Updated safety alerts at the top of their Instagram feed
Meta clarified that while adults may run accounts on behalf of children, any account found to be operated by the child themselves will be deleted.
Focus on online safety for India’s young Instagram users
With India being one of Instagram’s largest markets, this move is expected to significantly improve the online experience for teenagers and children. As young users increasingly engage with social media, Meta’s new features aim to create a safer, age-appropriate space, especially for Indian families navigating digital platforms.