In July, Meta, the parent company of Facebook and Instagram, removed over 15.8 million pieces of content on Facebook and more than 5.9 million on Instagram in India for violating its policies. The company also received user reports and provided tools to resolve issues in 5,392 cases on Facebook and 5,102 on Instagram.
In cases that required specialised review, Meta examined around 25,306 reports for Facebook and 15,044 for Instagram, taking action on 5,392 and 4,635 of them respectively. The remaining reports were reviewed but did not necessarily result in action.
These actions were taken to comply with India's IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, which require digital and social media platforms with more than five million users to publish monthly compliance reports. In June, Meta had removed over 21.8 million pieces of content on Facebook and more than 5.9 million on Instagram.
On another front, WhatsApp, also owned by Meta, continued to enforce its user-safety measures, banning over 72 lakh (7.2 million) accounts in India in July. Notably, 31,08,000 (about 3.1 million) of these accounts were banned proactively, before any user reports were received.
WhatsApp also received 11,067 complaint reports during the month and took action on 72 of them. In these reports, "Accounts Actioned" means WhatsApp either banned an account or, depending on the situation, restored a previously banned one.
To enhance user protection, India's Centre set up the Grievance Appellate Committee (GAC), from which WhatsApp received five orders between July 1 and July 31 and complied with all of them. The committee addresses user concerns related to content and other issues on social media platforms, in line with India's efforts to strengthen digital regulations and protect the rights of its digital citizens.