DPDP Rules 2025: Operational Detail with a Pro‑State, Pro‑Big‑Platform Tilt

The Rules translate the DPDP Act’s broad principles into operational detail on consent, security, breach notification, retention and institutional functioning, with a clear “digital‑first” design for the Board, appeals and payments. They also introduce structurally novel ideas such as Consent Managers. At the same time, they retain wide policy levers with the Central Government, such as the ability to define localisation categories, set conditions on cross‑border flows and call for information, while relying heavily on internal standards and executive discretion rather than independent oversight. In effect, the framework is modern and granular, but it is optimised primarily for administrative manageability and State flexibility, not for strict data minimisation or strong external checks.

Rights, transparency and Consent Managers 

On paper, the Rules advance transparency and user rights: Data Fiduciaries must issue clear, stand‑alone notices, publish contact details of a Data Protection Officer (or equivalent), and explain how Data Principals can exercise rights or file grievances within a 90‑day outer limit. But there is no concrete requirement for multi‑lingual notices, accessibility for persons with disabilities, or usability testing for low‑literacy audiences, limiting how “informed” consent will be across India’s diversity. The Consent Manager framework is conceptually strong: interoperable platforms with strict neutrality, conflict‑of‑interest, security and audit duties could give users a single, powerful control surface over consent. Yet entry barriers (Indian incorporation, ₹2 crore net worth, heavy governance constraints) and wide Board discretion over registration, suspension and cancellation mean that only a handful of large players are likely to qualify, raising concentration risks and scope for regulatory capture.

Security, breach notification and retention 

Security is one of the clearest strengths. The Rules mandate the use of encryption/obfuscation/masking or tokenisation, access controls over computer resources, logging and monitoring, backups and continuity mechanisms, and explicit contractual safeguards for processors. Breach notification is also reasonably robust: affected Data Principals must be informed “without delay” with actionable information, and the Board must be given a detailed report within 72 hours (extendable). The problems lie in blunt baselines and overload. A mandatory minimum one‑year retention of logs and personal data for all processing, plus three‑year inactivity‑based retention by very large e‑commerce, gaming and social‑media platforms, weakens the principle of storage limitation and increases long‑term exposure to breaches, even where risks and business needs are minimal. Conversely, the lack of a risk threshold for breach notifications means even low‑impact incidents must be reported, which can create “alert fatigue” and devalue genuinely serious warnings.

Children, persons with disabilities and carve‑outs 

For children and for persons with disabilities whose affairs are managed by lawful guardians, the Rules require “verifiable consent” grounded in reliable identity and age data or official virtual tokens (e.g. DigiLocker). This is more defensible than superficial age‑gates and correctly ties disability guardianship to sectoral laws. But it also drives more collection and linkage of highly sensitive identity data, amplifying privacy and security risks, and risks excluding real caregivers who lack formal guardianship documents, particularly in poorer contexts. More significantly, the Act’s child‑protection safeguards are weakened by wide exemptions. Educational institutions, healthcare and childcare settings, and transport providers may conduct “tracking and behavioural monitoring” in the name of safety or education, and there are functional carve‑outs for email account creation, real‑time location tracking, harmful‑content filtering and age confirmation. These terms are loose and easily stretched, risking normalisation of pervasive surveillance of children with limited independent oversight or clearly enforceable limits. 

State processing, research exemption and cross‑border flows 

The Rules wrap the Act’s broad State exemptions in a generic standards framework: State processing must be lawful, purpose‑limited, data‑minimised to what is necessary, subject to quality controls, retained only “as required”, and protected by safeguards, with “accountability” resting on those who determine purposes and means. However, there is no requirement for independent data‑protection impact assessments, deletion schedules or third‑party audits for large public databases (welfare, policing, surveillance), so these standards are effectively self‑policed. The blanket exemption for processing “necessary for research, archiving or statistical purposes”, subject only to the same generic standards, contains no explicit requirements for anonymisation, de‑identification, ethics review or re‑identification safeguards, allowing long‑term secondary use under a research label. Cross‑border transfers are formally liberal: data may flow anywhere, but the Central Government can impose conditions whenever data is made available to foreign States or their agencies, and there is no baseline requirement for “equivalent protection” in destination countries, leaving individuals’ rights abroad to ad hoc executive decisions. 

Institutional design, enforcement and overall balance 

The Data Protection Board is set up as a digital‑first, tribunal‑like body with defined procedures, quorum and timelines, and appeals lie digitally to TDSAT. This can improve speed and reduce paper‑based friction, but may disadvantage people with poor digital access. More importantly, the Board’s Chairperson and Members are chosen by committees dominated by senior civil servants, and its officers are largely on deputation from Government or public sector entities, limiting structural independence from the executive at precisely the moment when the State is also a major, privileged data processor. Significant Data Fiduciaries face additional duties: annual impact assessments and audits, some algorithmic “due diligence” and possible category‑based localisation. But these are vaguely framed (especially on algorithmic transparency) and enforced mostly vertically (towards the Board), not via public accountability. Overall, the Rules do meaningfully operationalise the DPDP Act and introduce some forward‑looking mechanisms, yet they plainly prioritise administrative convenience, State flexibility and large‑entity manageability over stringent minimisation, strong checks on State use and independently overseen protection of individuals’ data rights.

The Digital Personal Data Protection Rules, 2025 are a serious, technically aware effort to make the DPDP Act work in practice. They do well on specifying security, breach reporting, consent flows and a “digital‑first” enforcement architecture. But the overall balance is skewed: they keep major levers with the Central Government, are easier for large, well‑resourced entities to comply with than for smaller ones, and leave State processing and “research” uses comparatively weakly checked. The result is a modern‑looking framework that improves order and manageability in India’s data ecosystem but stops short of building a genuinely rights‑maximising, independently overseen regime. 

Source: https://www.meity.gov.in/static/uploads/2025/11/53450e6e5dc0bfa85ebd78686cadad39.pdf

Authored by: Adv. Sriman Mishra