In December 2023, Meta, the parent company of Facebook and Instagram, reported significant content moderation actions in India. The tech giant removed over 19.8 million pieces of content on Facebook and 6.2 million on Instagram for violating the platforms' respective policies.
What Happened?
During the month, Facebook received 44,332 reports through India's grievance mechanism. Meta provided resolution tools in 33,072 of those cases, including channels for reporting specific content violations, options to download personal data, and support for hacked accounts. This reporting is required under the IT (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021. Of the remaining 11,260 reports requiring specialized review, Meta actioned 6,578, leaving 4,682 reviewed but not actioned.
Instagram saw a similar pattern, receiving 19,750 reports through the grievance mechanism. Meta resolved issues in 9,555 of those cases. The other 10,195 reports underwent specialized review, resulting in action on 6,028 cases, with 4,167 reviewed but not actioned.
The IT Rules, 2021 mandate monthly compliance reporting by large digital and social media platforms, and Meta's disclosures reflect that obligation. The company notes that action against violating content may include removal, or covering disturbing images or videos with a warning.
These figures mark a rise from November, when Meta removed over 18.3 million pieces of content on Facebook and 4.7 million on Instagram under the same policies.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.