The Ministry of Electronics and Information Technology’s (MeitY) latest advisory underscores a familiar regulatory approach: placing the primary burden of policing online speech squarely on intermediaries, while retaining the threat of criminal liability if they fall short. Although the advisory reiterates existing legal duties rather than introducing new rules, its cumulative effect adds to the compliance load platforms already face under India’s expanding digital governance framework.
For large platforms, this translates into sustained investments in moderation systems, automated tools, and grievance redressal infrastructure. However, the challenge becomes more complex for global platforms with permissive content policies. Services such as Reddit, which allow not-safe-for-work (NSFW) content within clearly labelled communities and rely heavily on community-level moderation, will have to assess how these decentralised structures align with India’s expectations under intermediary law.
X faces a similar tension. The platform’s public policies permit consensual adult content, subject to labelling and visibility controls, rather than a blanket prohibition. As a result, enforcement under Indian law will depend not only on platform rules but also on how consistently platforms review content once authorities or users flag it.
More broadly, the advisory reflects a continued shift of responsibility from the state to private platforms. This pattern also emerged in MeitY’s Non-Consensual Intimate Imagery (NCII) Standard Operating Procedure (SOP), which drew criticism for concentrating on takedown timelines, already among the strictest globally, while offering less clarity on improving investigative capacity, dealing appropriately with victims, or streamlining coordination among law enforcement agencies.
As the government leans further on intermediary compliance, the unresolved question remains whether expanding platform obligations can meaningfully address enforcement gaps without parallel institutional reform.
MeitY has asked intermediaries, including social media platforms, to curb the hosting and circulation of obscene, pornographic, vulgar, and other unlawful content on their platforms, through an advisory dated December 29, 2025, seen by MediaNama. Issued by MeitY’s Cyber Laws Division, the advisory reminds platforms of their statutory due diligence obligations under the Information Technology Act, 2000 (IT Act), and the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 (IT Rules).
The ministry said it has, over time, received reports and representations through public discourse, stakeholder submissions, and judicial observations indicating that certain categories of content circulating on intermediary platforms may not comply with existing laws on decency and obscenity. These instances, it said, have raised concerns across sections of society about the responsible use of digital platforms. MeitY noted that such concerns have featured in parliamentary discussions and court proceedings, and that some specific cases have been referred to law enforcement agencies for action.
The ministry said there is a need for “greater consistency and rigour” in how intermediaries identify, report, and expeditiously remove content that is obscene, indecent, vulgar, pornographic, paedophilic, harmful to children, or otherwise unlawful, as prescribed under existing law. To that end, the advisory reiterates intermediaries’ obligation to observe due diligence as a condition for retaining liability exemptions under the IT Act.
The ministry reminded intermediaries that compliance with due diligence obligations remains a precondition for availing safe harbour under Section 79 of the IT Act. Platforms must therefore make reasonable efforts to ensure that users do not host, display, upload, modify, publish, transmit, store, update, or share any information that is obscene, pornographic, paedophilic, harmful to children, vulgar, indecent, sexually explicit, or otherwise unlawful. MeitY also stressed that intermediaries must act expeditiously once they receive actual knowledge of unlawful content.
In addition, the advisory stated that intermediaries must not permit, in any manner whatsoever, the hosting, display, uploading, publication, transmission, storage, or sharing of content prohibited under any law in force. The ministry warned that failure to observe these obligations would result in the loss of liability exemption under Section 79 of the IT Act and could attract consequential action under the IT Act and the Bharatiya Nyaya Sanhita (BNS), 2023.
MeitY also highlighted operational obligations. All intermediaries must deploy accessible reporting and grievance redressal systems. Significant social media intermediaries must additionally deploy technology-based measures, including automated tools, to proactively prevent the dissemination of such unlawful content and ensure timely compliance with removal requirements. Separately, the advisory reiterated the 24-hour takedown requirement, upon receiving a complaint, for content that depicts or impersonates an individual in any sexual act or conduct.
Finally, the ministry advised intermediaries to immediately review their internal compliance frameworks, content moderation practices, and user enforcement mechanisms. It also pointed to penal provisions under Sections 67, 67A, and 67B of the IT Act; the BNS, 2023; the Indecent Representation of Women (Prohibition) Act, 1986; the Protection of Children from Sexual Offences Act, 2012; and the Young Persons (Harmful Publications) Act, 1956, warning that non-compliance could result in prosecution under these laws.
Earlier in 2025, the government took several regulatory actions to curb obscene and vulgar content online. In February 2025, the Ministry of Information and Broadcasting (MIB) urged both over-the-top (OTT) platforms and social media intermediaries to comply with content moderation norms and adhere to the Code of Ethics under the IT Rules, following complaints about sexually explicit and vulgar material circulating online.
In July 2025, MIB issued blocking orders against 25 OTT platforms that were found to be streaming obscene content. These included services such as Ullu, ALTT, and others, with internet service providers being instructed to disable access to the corresponding websites.
Union Minister for Electronics and IT, Railways, and I&B Ashwini Vaishnaw confirmed in Parliament that, as of late July 2025, 43 OTT platforms had been blocked for similar reasons, underscoring the scale of the crackdown on platforms failing to moderate content deemed obscene or harmful.
Furthermore, in November 2025, MeitY issued an SOP to combat the dissemination of NCII, setting out clear mechanisms for victims, intermediaries, and different law enforcement agencies to remove such content within 24 hours of reporting.