India has amended the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021, with changes effective from 20 February 2026, marking a shift in how artificial intelligence (AI) and online platforms are governed.
The 2026 amendment follows a global rise in deepfakes, AI-generated content and misinformation, which has prompted governments to recalibrate regulatory frameworks to balance innovation with user protection. According to the official Gazette notification, the amendment introduces definitional clarity, expands due diligence obligations, strengthens transparency requirements and shortens grievance redress timelines for intermediaries.
A central feature of the amendment is the formal recognition of “Synthetically Generated Information” (SGI), defined as audio, visual or audiovisual content (with or without accompanying audio) that is digitally created or algorithmically modified and cannot be distinguished from genuine material.
This includes deepfake videos, AI-generated voice imitations and artificially created images. The definition excludes routine editing processes such as colour correction, accessibility enhancements and educational or illustrative material, thereby protecting legitimate digital creativity.
2026 IT Amendment Clarifies SGI Norms
The amendment introduces a formal definition of SGI covering audio, visual or audiovisual content created, generated, modified or altered through any computer resource. Under Rule 2(1A), SGI is deemed “information” wherever unlawful information is referenced, bringing synthetic content within the scope of existing compliance provisions.
Rule 2(1B) clarifies that automated removal or disablement of content, when undertaken in compliance with the Rules, does not violate Section 79(2) of the IT Act, addressing earlier ambiguity around safe harbour protections.
Intermediaries are required to inform users every three months about their rights, liabilities, penalties and enforcement actions, including specific warnings related to SGI. Takedown and grievance redress timelines have also been substantially tightened, discussed in detail below.
Rule 4(4), which earlier required platforms to endeavour to deploy technology-based measures, has been strengthened to mandate deployment of appropriate technical measures. Platforms must also disclose the identity of SGI violators to victims or their authorised representatives under lawful process. References to criminal law have been updated to align with the Bharatiya Nyaya Sanhita 2023.
Due Diligence Norms for Synthetic Content
Rule 3(3) imposes enhanced obligations on intermediaries that provide tools to create or alter SGI. Such entities must implement reasonable and proportionate technical and operational controls, which may include automated systems, to prevent the dissemination of unlawful synthetic content. This covers material relating to child sexual exploitation, non-consensual intimate imagery, falsified electronic records, extremist content, depictions of explosives or weapons, and malicious deepfake impersonation.
Platforms must ensure that SGI carries clear on-screen disclosures for visual content and predefined audio identifiers for audio material. Permanent metadata incorporating a unique identifier traceable to the intermediary’s system must be embedded in such content. This metadata must remain tamper-resistant and not be alterable by users, even if the content is downloaded or modified.
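To make the metadata requirement concrete, here is a minimal Python sketch of how a platform might bind a unique, traceable identifier to a piece of SGI content. The function names (`make_sgi_metadata`, `verify_sgi_metadata`) and the key are hypothetical illustrations, not part of the Rules; an HMAC binds the identifier to the content so tampering with either is detectable, though true tamper-resistance that survives re-encoding or editing would in practice require robust watermarking techniques beyond this sketch.

```python
import hashlib
import hmac
import uuid

# Hypothetical platform signing key; a real deployment would use a
# key-management service, not a hard-coded value.
PLATFORM_KEY = b"example-secret-key"

def make_sgi_metadata(content: bytes) -> dict:
    """Attach a unique, platform-traceable identifier to SGI content.

    The identifier is bound to the content bytes with an HMAC so that
    any later change to either field fails verification.
    """
    sgi_id = str(uuid.uuid4())  # unique identifier recorded in platform systems
    tag = hmac.new(PLATFORM_KEY, sgi_id.encode() + content,
                   hashlib.sha256).hexdigest()
    return {"sgi_id": sgi_id, "integrity_tag": tag}

def verify_sgi_metadata(content: bytes, metadata: dict) -> bool:
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(PLATFORM_KEY, metadata["sgi_id"].encode() + content,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, metadata["integrity_tag"])
```

The design choice here is that the identifier alone proves nothing; only the platform, holding the key, can vouch that a given identifier was issued for that exact content.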
Stricter Compliance for Social Platforms
Significant Social Media Intermediaries are subject to additional obligations. Under Rule 4(1A), they must require users to indicate whether content is synthetically generated, verify such declarations through automated processes and classify and label SGI appropriately before publication. Failure in the pre-publication process may be treated as a lapse in due diligence, potentially affecting safe harbour protections under Section 79 of the IT Act.
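The declare-verify-label sequence described above can be sketched in Python. Everything here is an illustrative assumption: the `Upload` structure, the toy `detect_sgi` heuristic (a real system would run machine-learning classifiers over the media itself) and the output fields are not prescribed by the Rules, which specify only the outcome: SGI must be verified and labelled before publication.

```python
from dataclasses import dataclass

@dataclass
class Upload:
    content_id: str
    user_declared_sgi: bool  # the user's mandatory declaration at upload

def detect_sgi(content_id: str) -> bool:
    """Stand-in for an automated synthetic-content classifier.

    Toy heuristic for illustration only; real detection would analyse
    the media, not its identifier.
    """
    return content_id.startswith("sgi-")

def pre_publication_check(upload: Upload) -> dict:
    """Verify the user's declaration and decide the label before publishing."""
    detected = detect_sgi(upload.content_id)
    is_sgi = upload.user_declared_sgi or detected
    return {
        # Content confirmed as synthetic is labelled before it goes live.
        "label": "synthetically generated" if is_sgi else None,
        # A mismatch (detected but undeclared) is escalated for review,
        # since it may indicate a false declaration by the user.
        "flag_for_review": detected and not upload.user_declared_sgi,
    }
```

A mismatch between declaration and detection is the case platforms will care about most, since an undetected false declaration is precisely the lapse that could jeopardise safe harbour.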
The amendment also shortens response times for removal requests and law enforcement orders. Urgent takedown notices that previously required action within 36 hours must now be addressed within three hours, while certain defined harmful content must be removed within two hours. Grievance response time has been reduced from 15 to seven days, with specific cases requiring disposal within 36 hours.
Requests from law enforcement must originate from officers of at least Deputy Inspector General rank and be authorised in writing, introducing procedural structure to enforcement actions. In practice, these timelines require platforms to maintain 24/7 trust and safety teams with real-time monitoring capabilities and immediate alert systems.
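For trust and safety teams, the compressed timelines above translate into hard deadlines that alerting systems must compute per notice. A minimal sketch, using the SLA figures stated in this article (the notice-type names and function are hypothetical):

```python
from datetime import datetime, timedelta

# Response windows in hours, as described in the amendment's timelines.
SLA_HOURS = {
    "urgent_takedown": 3,           # designated urgent notices
    "defined_harmful_content": 2,   # certain defined harmful content
    "grievance": 7 * 24,            # general grievance redress, 7 days
    "grievance_expedited": 36,      # specific cases requiring 36-hour disposal
}

def response_deadline(notice_type: str, received_at: datetime) -> datetime:
    """Compute the latest permissible action time for a notice."""
    return received_at + timedelta(hours=SLA_HOURS[notice_type])
```

A monitoring pipeline would page on-call staff well before each computed deadline, which is why the article notes that round-the-clock teams become effectively mandatory.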
Sectoral Implications Across the Ecosystem
The 2026 amendment to India’s IT Rules reshapes compliance expectations across digital sectors.
AI and content creation platforms will need to establish watermarking systems at the point of generation, build immutable metadata infrastructures carrying persistent unique identifiers, deploy classifiers capable of detecting synthetic content across formats, and maintain monitoring pipelines supported by automated processes. While this may require increased investment, particularly from smaller firms, it also provides clearer regulatory standards within which product development can proceed.
Social media businesses will need to overhaul upload workflows, labelling processes, user interface layers and backend verification systems to accommodate mandatory declarations and pre-publication checks for SGI. They must also implement identity disclosure protocols enabling victims affected by SGI misuse to obtain the identity of alleged violators through due process.
For news media, advertising agencies and the entertainment industry, the Rules necessitate safeguards around election-related and politically sensitive material, particularly during periods of heightened political activity. Editorial and content review mechanisms will need to ensure that synthetic elements are disclosed and that unlawful or misleading material is not disseminated.
Start-ups are likely to face increased compliance costs as a result of the technical and procedural requirements. However, the structured definition of SGI, clarified due diligence norms and codified timelines reduce regulatory uncertainty and delineate clearer operational boundaries for innovation.
India’s amended IT framework signals a move towards mandatory transparency, labelling and accelerated grievance redress in AI governance. The implementation phase will test whether platforms can align engineering architecture, operational processes and legal compliance within compressed timelines while sustaining user trust.
The author is a Partner and national media and entertainment industry Leader at Grant Thornton Bharat.
(The cover image accompanying this story was created using AI-based tools.)