SC flags risks of unregulated online content
Samira Vishwas November 29, 2025 03:24 AM

In this Capital Beat episode, Supreme Court lawyer Sanjay Hegde, independent journalist Prashant Kanojia and senior editor TK Rajalakshmi debated the implications of the Supreme Court hearing on online content regulation. The discussion followed remarks by the Chief Justice of India expressing concern over unregulated user-generated content, in the case involving YouTuber Ranveer Allahbadia. The central issue: whether current freedoms on digital platforms require legal curbs to ensure accountability.

Justice Surya Kant observed, “Why create my own channel? I am not accountable to anyone. Somebody has to be accountable.” That comment set the stage for the debate. Solicitor General Tushar Mehta described the issue as one of perversity rather than mere obscenity, arguing that creators cannot act without boundaries under the guise of free expression. Justice Joymalya Bagchi added that while free speech is protected, content threatening national unity, integrity or sovereignty can justifiably be curtailed. He raised specific questions: if content is perceived as anti-national or disruptive of societal norms, will creators take responsibility, and will self-regulation suffice given the viral speed of online media? The court suggested preemptive measures such as age verification, disclaimers and possible regulatory oversight.

What Hegde says about regulation

Hegde noted that while platforms maintain their own moderation policies, attempts at censorship through misuse of those policies, such as copyright claims or hate-speech rules, are common. He stated, “This is a new phenomenon and the contours … have never been clearly delineated.” He warned that shifting control from platform-level moderation to government-level regulation complicates enforcement: “You may be able to block sites or platforms … but then instead of going on TikTok people have gone on Meta.”

He added that although the judiciary’s remarks signal openness to content curation and possible censorship of material deemed “dangerous,” past global attempts to regulate the internet — even in countries where the internet is tightly controlled — have not succeeded. He expressed skepticism: technical mechanisms, VPNs and alternate platforms often enable users to bypass restrictions.

Hegde pointed out that a proposed neutral authority might attempt to regulate the digital space, but questioned its effectiveness across the vast internet. He said, “That is the real danger,” cautioning against an “advance licence and advance blessing” for regulation: moving from punishing individual infractions to imposing broad control over expression.

Kanojia raises fears of censorship, inequality

Kanojia highlighted concerns over the independence of any regulatory authority. “I have a doubt on the independence of such bodies,” he said, noting that many institutions in India labelled “independent” are in practice aligned with the government. For him, the push for a regulator threatens democracy: “It’s gonna be … a very bad thing for the democracy.”

He argued that the internet democratises media, enabling subaltern and marginalised voices — Dalits and minorities among them — to create content outside corporate media’s control. He warned that regulation would dismantle that access. “The moment you start regulating, it slowly and gradually will convert into censorship,” he asserted.

Drawing on past experience, Kanojia described the risks faced by creators without resources. He recounted personal trauma: police brutality left him with a broken jaw and lasting physical harm. He said the hazard lies not only in legal battles, which a privileged few can navigate, but also in “police brutality” and the ordeal of “the process”. Subaltern creators may struggle under new curbs, while well-connected individuals continue with impunity.

Rajalakshmi questions necessity of extra regulation

TK Rajalakshmi described the court’s tone as “very strange,” expressing concern that the observations reflect a misunderstanding of existing laws. She noted that provisions such as Section 69A of the IT Act, the Copyright Act and relevant IPC sections already allow the government to take down content affecting national security or sovereignty.

Rajalakshmi asked: if those mechanisms operate, what justifies adding a new regulatory apparatus? She emphasised the subjectivity in defining what is “anti-national”: “Who is going to define what is … anti-national? It’s such a subjective thing today.” She warned that ordinary citizens could file FIRs for content they find objectionable, while certain powerful entities continue to spread divisive material without consequence.

She argued that social media platforms and small content-creating channels have the potential to democratise access to information. While acknowledging that not all content may be “politically correct,” she insisted that existing laws suffice for punishing hate speech, divisive or illegal content. Additional regulation, she said, risks exacerbating fear among creators and suppressing free expression.

Shared concerns over chilling effects, unequal power

All panellists agreed on one point: creators today are already cautious about what they upload, especially those who critique political leaders or dominant media narratives. Kanojia noted that many subaltern creators may avoid controversial topics because of fear, while those with resources can weather legal storms.

Hegde warned that what begins as targeted punishment for objectionable content could expand into broad censorship under regulatory authority. He contrasted creative expression — satire, social commentary, dissent — with the existing powers of the state, arguing that a blanket regulatory framework could nullify freedom.

Rajalakshmi highlighted the present atmosphere of “fear psychosis,” where individuals self-censor to avoid being branded anti-national. She questioned the need for more regulation when existing laws are often employed disproportionately already.

Key uncertainties after apex court’s observations

The Supreme Court’s observations stopped short of issuing any binding order or formal directive for an online content regulator. The panel agreed that while the remarks highlight a problem, they do not lay down a framework.

Whether the court or the government will move to establish a neutral regulatory authority, and how such an authority would function, remains unclear. The scope and definition of “dangerous content”, “anti-national” sentiment or community standards were not specified. No enforcement measures or procedural safeguards were outlined during the hearing.

Consequently, creators, journalists and ordinary users remain in a state of uncertainty about the future of online expression. Many fear that continued calls for regulation may culminate in expanded censorship under vague standards, penalising dissent and minority voices while powerful actors continue unchecked.

The content above has been transcribed from video using a fine-tuned AI model. To ensure accuracy, quality, and editorial integrity, we employ a Human-In-The-Loop (HITL) process. While AI assists in creating the initial draft, our experienced editorial team carefully reviews, edits, and refines the content before publication. At The Federal, we combine the efficiency of AI with the expertise of human editors to deliver reliable and insightful journalism.

© 2025 LIDEA. All Rights Reserved.