Centre’s online content blocking orders double to 24,000 in a year, over half on X
Context
The Ministry of Electronics and Information Technology (MeitY) has sharply increased its online content blocking orders, which have roughly doubled to about 24,000 in a year from the nearly 7,000 recorded annually in recent government reports (2022-23). The surge is driven by concerns over misinformation, public order, and sovereignty, with recent emphasis on the rising threat of AI-generated deepfakes. Over 60% of these takedowns targeted the platform X (formerly Twitter), and the government has increasingly relied on emergency takedown clauses. To handle this unprecedented volume, inter-ministerial discussions are currently underway to decentralize IT Act blocking powers to other key ministries like Home Affairs and Defence.
UPSC Perspectives
Polity & Governance
The legal mechanism for online censorship is rooted in Section 69A of the Information Technology (IT) Act, 2000, which empowers the Centre to block public access to information on grounds that map to reasonable restrictions: protecting the sovereignty and integrity of India, defence, security of the State, friendly relations with foreign States, and public order. In the landmark Shreya Singhal v. Union of India (2015) judgment, the Supreme Court upheld Section 69A's constitutionality specifically because of its procedural safeguards, such as written reasons and pre-decisional hearings. However, there are rising concerns regarding transparency in the current surge of blocks. Rule 16 of the IT (Procedure and Safeguards for Blocking for Access of Information by Public) Rules, 2009 mandates strict confidentiality of these government orders, meaning users are often completely unaware of why their content was removed. For UPSC Mains, this highlights the tension between national security imperatives and a citizen's ability to seek judicial review against potential administrative overreach.
Internal Security
The proliferation of synthetic media and coordinated disinformation poses severe threats to national security, acting as a force multiplier during critical periods like border escalations or military standoffs. To counter immediate threats without delay, the government is heavily utilizing the emergency clause (Rule 9) of the IT Blocking Rules, 2009, which allows the Designated Officer to issue interim blocking orders before the inter-ministerial committee can formally convene. Furthermore, the Ministry of Home Affairs and other nodal agencies act as the primary 'requesting authorities', sending content removal requests to MeitY's Designated Officer for review under Section 69A. While this ensures rapid crisis response in the digital domain, bypassing MeitY's centralized technical oversight raises concerns about fragmented internet governance and the potential misuse of emergency powers for political censorship.
Science & Technology
The sheer volume of blocked URLs, over half originating on X, highlights a severe regulatory gap in managing generative AI and deepfakes. Global tech companies operate in India under the safe harbour principle in Section 79 of the IT Act, which protects them from liability for third-party content provided they act expeditiously on government takedown notices. However, the exponential rise in hyper-realistic deepfakes (such as manipulated videos of MPs) necessitates a paradigm shift from reactive government takedowns to proactive algorithmic accountability by the platforms themselves. With the MeitY Blocking Committee now convening virtually multiple times a week just to review AI-generated posts, the situation underscores the urgent need for a comprehensive AI legislative framework in India to address the malicious generation of synthetic content, rather than reliance solely on the traditional content blocking mechanisms of a decades-old statute.