Verifiable Parental Consent Under DPDP Rules 2025: The New Compliance Frontier for EdTech, Gaming & Social Apps
The DPDP Rules 2025 require verifiable parental consent before processing any data of a user under 18. Here is what EdTech, gaming, and social platforms must build before May 2027.
The Under-18 Problem No Indian Platform Can Ignore
When the Digital Personal Data Protection (DPDP) Rules 2025 were notified on 13 November 2025, one provision sent product and legal teams scrambling more than any other: Rule 10 — Verifiable Parental Consent.
The DPDP Act classifies everyone under the age of 18 as a "child." That is a far wider net than most global frameworks. Under the GDPR, the digital age of consent can be as low as 13. Under India's law, a 17-year-old signing up for a coding course, a multiplayer game, or a social app is legally a child — and you cannot process their personal data until you have obtained verifiable consent from their parent or lawful guardian.
For EdTech, gaming, social media, and consumer apps with young user bases, this is not a checkbox. It is an architectural change.
What "Verifiable" Actually Means
The word doing all the heavy lifting in Rule 10 is verifiable. It is not enough to show a tick-box that says "I am a parent." A Data Fiduciary must take reasonable steps to confirm two things:
- Age — that the person giving consent is genuinely an adult.
- Identity and relationship — that this adult is in fact the parent or lawful guardian of the child.
The Rules offer one pragmatic path: if the parent is already a verified user of your platform, you may rely on the age and identity details they previously provided. The Rules also contemplate the use of voluntarily provided identity documents or virtual tokens mapped to a verified identity (for example, through a Digital Locker service).
Key Insight: The cleanest compliance posture is to treat parental consent as a linked-account problem, not a pop-up problem. The parent has an identity. The child's account is attached to it. Consent flows from the verified adult to the dependent profile.
Where Platforms Will Get Tripped Up
1. Age Assurance Is Genuinely Hard
There is no perfect, privacy-friendly, universally accepted age-verification mechanism — not in India, not anywhere. Self-declared birthdates are trivially gamed by minors. Document uploads create a fresh pile of sensitive data you now have to secure. The honest answer is that you will need a layered approach: self-declaration plus behavioural signals plus a verification step that escalates only when needed.
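The layered approach can be sketched as an escalation policy: trust self-declaration by default, and escalate to a verification step only when behavioural signals disagree with the declared age. The signal score and threshold below are invented for illustration — real signals and cut-offs would come from your own risk analysis:

```python
from enum import Enum


class AgeCheck(Enum):
    ACCEPT_DECLARED = "accept_declared"
    REQUEST_VERIFICATION = "request_verification"
    ROUTE_TO_PARENTAL_CONSENT = "route_to_parental_consent"


def assess_age(declared_age: int, minor_signal_score: float,
               threshold: float = 0.6) -> AgeCheck:
    """Layered age assurance: escalate only when needed.

    minor_signal_score is a 0..1 estimate (from behavioural signals)
    that the user is actually a minor despite their declaration.
    """
    if declared_age < 18:
        # Declared minors go straight into the parental-consent flow,
        # never to a dead end.
        return AgeCheck.ROUTE_TO_PARENTAL_CONSENT
    if minor_signal_score >= threshold:
        # Declared adult, but signals disagree: escalate to verification
        # rather than collecting identity documents from everyone.
        return AgeCheck.REQUEST_VERIFICATION
    return AgeCheck.ACCEPT_DECLARED
```

Escalating only on disagreement keeps the sensitive-document pile small: most users never upload anything, which is itself a data-minimisation win.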
2. The "Education" Ambiguity
The Rules carve out limited exemptions — for example, processing strictly necessary for the safety of a child, or by clinical and healthcare establishments. There is also language around educational institutions. But the boundary of "education" is not crisply defined. Does it cover only formal schools, or also EdTech platforms and private tutors? Until the Data Protection Board clarifies, EdTech companies should assume the strict reading: get verifiable parental consent.
3. No Tracking, No Targeted Ads — Full Stop
Beyond consent, the Act prohibits behavioural tracking and targeted advertising directed at children. A gaming app cannot quietly profile a 15-year-old to serve personalised offers. This affects monetisation models, not just sign-up screens.
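A defensive pattern here is to gate every personalisation code path on a single child flag, failing closed when age is unknown. A minimal sketch, with invented field names:

```python
def personalisation_allowed(profile: dict) -> bool:
    """Behavioural tracking and targeted ads must be off for child profiles.
    Fail closed: if the age status is unknown, treat the profile as a child."""
    return not profile.get("is_child", True)


def choose_ad_mode(profile: dict) -> str:
    """Route child-flagged profiles to non-personalised, contextual ads only."""
    if personalisation_allowed(profile):
        return "targeted"
    return "contextual_only"
```

Centralising the check in one predicate means a new feature cannot quietly reintroduce profiling for minors — every ad or tracking call has to pass through it.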
A Practical Build Checklist
- Add an age gate at account creation — and design it so that an under-18 result routes into a parental-consent flow rather than a dead end.
- Build a parent-linked consent record — store which adult consented, when, for what purposes, and how they were verified. This is your audit trail.
- Make withdrawal symmetric — a parent must be able to revoke as easily as they consented.
- Disable tracking and ad personalisation for any profile flagged as a child.
- Localise the consent request — parents across India will read these notices in 22 different languages. An English-only flow is not "informed" consent.
- Log everything immutably — when the Board asks how you obtained consent for a specific child, "we think they ticked a box" is not an answer.
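Several items on this checklist — the parent-linked consent record, symmetric withdrawal, and immutable logging — converge on one structure: an append-only, hash-chained consent log. This is one possible sketch, not a prescribed design; the field names are illustrative:

```python
import hashlib
import json
from datetime import datetime, timezone


class ConsentLog:
    """Append-only consent records, hash-chained so that any later
    tampering with an entry breaks the chain on verification."""

    def __init__(self) -> None:
        self.entries: list[dict] = []

    def append(self, child_id: str, parent_id: str,
               action: str, purposes: list[str],
               verification_method: str) -> dict:
        prev_hash = self.entries[-1]["hash"] if self.entries else "GENESIS"
        record = {
            "child_id": child_id,
            "parent_id": parent_id,        # which adult consented
            "action": action,              # "granted" or "withdrawn" — symmetric
            "purposes": sorted(purposes),  # for what purposes
            "verification_method": verification_method,  # how they were verified
            "at": datetime.now(timezone.utc).isoformat(),  # when
            "prev_hash": prev_hash,
        }
        # Hash is computed over the record before it is added to the record.
        record["hash"] = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode()
        ).hexdigest()
        self.entries.append(record)
        return record

    def verify(self) -> bool:
        """Recompute every hash and check the chain links."""
        prev = "GENESIS"
        for entry in self.entries:
            body = {k: v for k, v in entry.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()
            ).hexdigest()
            if digest != entry["hash"]:
                return False
            prev = entry["hash"]
        return True
```

Because withdrawal is just another appended entry, the same log answers both audit questions: "how was consent obtained?" and "was it still in force on a given date?"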
The Timeline Is Shorter Than It Looks
The substantive obligations of the DPDP Act — including consent and children's data rules — become enforceable on 13 May 2027. Eighteen months sounds generous until you realise that age assurance, parent-linking, and consent-record infrastructure all need design, build, legal review, and testing. Platforms that start in 2027 will be retrofitting under pressure. Platforms that start in 2026 will simply be ready.
How Consently Helps
Consently's consent management platform is built DPDPA-first. That means parental-consent flows, granular purpose-based consent, symmetric withdrawal, immutable audit logs, and consent notices in all 22 Eighth Schedule languages are part of the core product — not a bolt-on. Our zero-PII Consent ID system lets you prove a valid consent exists for a given profile without warehousing more sensitive identity data than you need.
If your platform has users under 18 — and most consumer apps in India do — verifiable parental consent is the compliance project to start now. Talk to us about getting your consent layer ready well before May 2027.