February 17, 2026 • By Prachi Mantri

India's AI Regulations Are Here: What Every Founder and Business Owner Needs to Know

If you're a founder building with AI — or even just using ChatGPT for your marketing, AI tools for customer support, or automated decision-making anywhere in your operations — India's regulatory landscape is changing fast.

Three separate regulatory developments between November 2025 and February 2026 are creating an enforceable framework that every business using AI needs to understand. Not eventually. Now.


1. IT Rules Amendment — Takes Effect February 20, 2026

This is the one that carries immediate teeth.

The amended IT (Intermediary Guidelines and Digital Media Ethics Code) Rules will specifically regulate Synthetically Generated Information (SGI) starting February 20, 2026 — any AI-created or AI-altered content that appears indistinguishable from real content. This includes deepfakes, AI-generated images, AI voice cloning, and synthetic video.

What your business must comply with:

Mandatory Labeling: If your business creates or publishes AI-generated content — social media posts, marketing materials, product images, customer-facing communications — it must carry a visible label identifying it as AI-generated. Audio content needs a prefixed disclosure. Metadata must be embedded for traceability.
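To make the labeling-plus-metadata requirement concrete, here is a minimal Python sketch of how a business might attach both a visible label and machine-readable traceability metadata to a piece of AI-generated content. The label text, metadata fields, and SHA-256 provenance hash are our own illustrative assumptions; the Rules mandate a visible label and embedded metadata but do not prescribe this exact format.

```python
import hashlib
import json
from datetime import datetime, timezone

# Assumed label text and metadata schema -- the amended IT Rules require a
# visible label and embedded traceability metadata, but this format is
# illustrative, not prescribed.
AI_LABEL = "[AI-generated]"

def label_sgi(caption: str, tool_name: str) -> tuple[str, str]:
    """Return a visibly labeled caption and a JSON metadata blob
    for a piece of synthetically generated content."""
    labeled = f"{AI_LABEL} {caption}"
    metadata = json.dumps({
        "synthetic": True,                      # machine-readable SGI flag
        "generator": tool_name,                 # which AI tool produced it
        "labeled_at": datetime.now(timezone.utc).isoformat(),
        "content_sha256": hashlib.sha256(caption.encode()).hexdigest(),
    })
    return labeled, metadata

labeled, meta = label_sgi("Our new product lineup", "image-gen-v2")
print(labeled)   # [AI-generated] Our new product lineup
```

The same pattern extends to audio (a prefixed spoken disclosure) and video (an on-screen label), with the metadata embedded in the file container for traceability.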

Platform Obligations: If you run a platform with over 5 million registered users, you're classified as a Significant Social Media Intermediary (SSMI). You must obtain user declarations on whether uploaded content is synthetically generated and deploy reasonable technical measures to verify those declarations.

Faster Takedowns: The takedown timeline for illegal synthetic content has been compressed dramatically:

| Content Type | New Takedown Window | Previous Window |
| Court/government-ordered content | 3 hours | 24–36 hours |
| Non-consensual deepfake imagery | 2 hours | 24–36 hours |
| General grievances | 7 days | 15 days |
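To make the compressed windows concrete, here is a sketch of a deadline calculator a trust-and-safety team might run when a report lands. The category keys are our own shorthand, not terminology from the Rules.

```python
from datetime import datetime, timedelta, timezone

# Takedown windows under the amended IT Rules; the dictionary keys are
# illustrative shorthand, not official terminology.
TAKEDOWN_WINDOWS = {
    "court_or_government_order": timedelta(hours=3),
    "non_consensual_deepfake": timedelta(hours=2),
    "general_grievance": timedelta(days=7),
}

def takedown_deadline(category: str, received_at: datetime) -> datetime:
    """Latest time by which the reported content must be removed."""
    return received_at + TAKEDOWN_WINDOWS[category]

received = datetime(2026, 2, 21, 9, 0, tzinfo=timezone.utc)
deadline = takedown_deadline("non_consensual_deepfake", received)
print(deadline.isoformat())  # 2026-02-21T11:00:00+00:00
```

A two-hour window leaves no room for a manual legal review queue; the practical implication is that intake, triage, and removal need to be automated end to end.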

The Risk: Failure to comply means losing Safe Harbour protection under Section 79 of the IT Act. Without safe harbour, your platform becomes directly liable for user-generated content. For a startup, that's an existential risk.

What's exempt:

Routine editing (colour correction, compression, translation) that doesn't alter meaning. Clearly hypothetical or illustrative content. But if there's any ambiguity — label it.


2. DPDP Act Rules — Notified November 2025, Phased Compliance till May 2027

The Digital Personal Data Protection Act, 2023 received its operational rules on November 14, 2025, with phased compliance timelines extending till May 2027. For AI-driven businesses, the implications are significant.

If you process personal data through AI (and most startups do):

| What You Must Do | Why It Matters |
| Obtain explicit consent | Training your AI model on user data? You need clear, informed consent for it |
| Honour data principal rights | Users can demand access, correction, and deletion of their data — even if it's been fed into your model |
| Follow purpose limitation | Data collected for one purpose (e.g., customer support) can't be repurposed for another (e.g., ad targeting) without fresh consent |
| Protect children's data | No tracking, no behavioural monitoring, no targeted ads for children; parental consent required |
| Report breaches in 72 hours | If personal data is compromised, you must notify affected individuals and the Data Protection Board within 72 hours |

The Significant Data Fiduciary question:

If your startup processes high volumes of personal data or deals with sensitive categories, you could be classified as a Significant Data Fiduciary (SDF). SDFs face additional obligations, including appointing an India-based Data Protection Officer, conducting periodic Data Protection Impact Assessments, and undergoing independent data audits.

Even if you're a 10-person startup, the classification depends on volume and sensitivity of data, not company size.


3. India AI Governance Guidelines — The Benchmark That's Not Yet Law (But Will Be Treated Like It)

Released on November 5, 2025 by MeitY under the IndiaAI Mission, these guidelines lay down 7 foundational principles (called "sutras") for responsible AI:

  1. Trust as the Foundation — build trust across your entire AI value chain
  2. People First — maintain meaningful human oversight over AI decisions
  3. Innovation over Restraint — innovate, but manage risks proportionately
  4. Fairness and Equity — test and design AI systems to avoid bias
  5. Accountability — assign clear responsibility to developers, deployers, and users
  6. Understandable by Design — build in transparency and explainability
  7. Safety, Resilience, and Sustainability — ensure robustness and environmental responsibility

Why founders should care even though these are "voluntary":

These guidelines set the benchmark that sector regulators (RBI for fintech, SEBI for capital markets, IRDAI for insurance, ICMR for healthcare) will reference when framing their own binding rules. If you're raising capital, your investors' legal teams will increasingly evaluate you against these principles.

Three new institutional bodies are being established to operationalize the guidelines: the AI Governance Group (AIGG) for inter-ministerial coordination, the Technology and Policy Expert Committee (TPEC) to advise it, and the AI Safety Institute (AISI) for safety research and standards.

India has deliberately chosen not to enact a standalone AI law. Instead, it's building a layered framework using existing legislation (IT Act, DPDP Act, Consumer Protection Act, Bharatiya Nyaya Sanhita) with targeted amendments. The advantage for businesses: you don't need to learn an entirely new legal regime. The challenge: you need to understand how multiple laws interact.


The IndiaAI Mission — Infrastructure You Can Use

While regulations define what you must do, the IndiaAI Mission (approved by the Union Cabinet on March 7, 2024, with a ₹10,371.92 crore outlay over 5 years) defines what you can access.

If you're a startup or academic institution, this is directly relevant:

| Resource | What's Available |
| Compute | 38,000+ GPUs in a national facility; projected to reach 1,00,000 by end of 2026 |
| Subsidized Rate | ₹65 per GPU-hour — significantly below global market rates |
| GPU Options | Intel Gaudi 2, AMD MI300X, NVIDIA H100, H200, A100, AWS Trainium |
| AIKosh Platform | 9,800+ datasets and 273 AI models across 20 sectors |
| Language Data | Speech data across 12 Indian languages via Bhashini |
| Portal | indiaai.gov.in — applications, datasets, news, resources |

If compute costs have been a barrier for your AI development, this changes the equation.
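For a rough feel for the numbers, here is a back-of-the-envelope comparison for a month-long training run at the subsidized rate. The "market rate" of ₹250 per GPU-hour is purely our illustrative assumption; actual commercial rates vary widely by provider and GPU.

```python
# Back-of-the-envelope GPU cost at the IndiaAI subsidized rate.
# The market rate of Rs 250/GPU-hour is an illustrative assumption only.
SUBSIDIZED_RATE_INR = 65      # per GPU-hour (IndiaAI Mission)
ASSUMED_MARKET_RATE_INR = 250  # hypothetical comparison figure

gpus, hours_per_day, days = 8, 24, 30    # a month-long 8-GPU training run
gpu_hours = gpus * hours_per_day * days  # 5,760 GPU-hours

subsidized_cost = gpu_hours * SUBSIDIZED_RATE_INR
market_cost = gpu_hours * ASSUMED_MARKET_RATE_INR
print(f"Subsidized: Rs {subsidized_cost:,}")       # Rs 374,400
print(f"Assumed market rate: Rs {market_cost:,}")  # Rs 1,440,000
```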


Your Compliance Checklist — What to Do This Week

As practitioners who advise startups and businesses on regulatory compliance, we recommend the following:

Immediate (this week): Inventory every place your business publishes AI-generated or AI-altered content, and add visible labels and embedded metadata before February 20. Map where AI systems touch personal data in your operations.

Short-term (this quarter): Update consent flows and privacy notices to meet DPDP requirements, set up a breach-response process capable of notifying affected users and the Data Protection Board within 72 hours, and review whether your platform crosses the SSMI threshold.

Medium-term (2026): Assess whether you could be classified as a Significant Data Fiduciary, align internal AI governance with the seven sutras, and track sector regulators (RBI, SEBI, IRDAI, ICMR) as they frame binding rules.


The Bottom Line

India's AI regulations are no longer theoretical. The DPDP Act rules are already notified, the IT Rules for synthetic content take effect on February 20, and the governance guidelines are setting the standard regulators will follow. The timeline is clear and published.

For founders and business owners, the window to get ahead of compliance is now — not after a notice, not after a customer complaint, not after an investor flags it in due diligence.

The businesses that treat AI governance as a competitive advantage — not just a compliance checkbox — will be the ones that build lasting trust with customers, investors, and regulators.


The author is a Practising Company Secretary at PMNCO, working with founders and businesses on corporate compliance and regulatory advisory. For questions on how these frameworks apply to your business → pmnco.co.in