As India enters a new era of data protection with the Digital Personal Data Protection Act, 2023 (DPDPA), both businesses and users stand at a critical juncture.
The law requires a profound shift—not only in backend compliance but also in users' everyday experiences with websites and apps. Central to this evolution is the concept of consent: what it means, how it is collected, and how its misuse through “deceptive patterns” endangers both user rights and organisational trust.
The DPDPA establishes consent as one of the primary legal bases for processing personal data. For consent to be valid, it must satisfy five distinct requirements.
- Free: Consent must be given without coercion or pressure, and must not be bundled with other services. Organisations cannot tie the provision of a service to consent for processing that is not necessary for the service’s performance. If refusing consent leads to denial of service for non-essential processing, the consent cannot be considered freely given.
- Specific: Consent must be granular and purpose-specific. Individuals must understand precisely what data is being processed and for what specific purpose. Blanket consent for multiple purposes is invalid.
- Informed: Data principals must receive clear information about the processing through a proper notice. This includes an itemised description of the personal data being processed and the specific purposes for which it will be used, available in English or any of the 22 languages specified in the Eighth Schedule of the Constitution.
- Unconditional: Consent cannot be bundled with acceptance of terms and conditions or tied to other agreements. It must be separate and distinct from other contractual obligations.
- Affirmative Action: Consent must be demonstrated through a positive action. Pre-ticked boxes, inactivity, or silence cannot constitute consent. The individual user must take deliberate action to signify agreement.
Users must also be able to withdraw consent as easily as they give it—a feature often neglected in legacy designs. For children under 18, the law requires child-appropriate communication and verifiable parental consent, with special rules prohibiting the tracking or targeting of children for advertisements.
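To make these requirements concrete, here is a minimal sketch of how a system might model a consent record so the validity conditions above are enforced by construction. The shape is illustrative only; every field and function name is a hypothetical choice, not drawn from the Act or any official guidance.

```typescript
// Illustrative consent record: each field maps to one of the validity
// requirements discussed above. All names and shapes are hypothetical.
interface ConsentRecord {
  principalId: string;            // the data principal who gave consent
  purpose: string;                // one specific purpose per record ("Specific")
  dataCategories: string[];       // itemised data covered ("Informed")
  noticeLanguage: string;         // the language the notice was shown in
  givenByAffirmativeAction: true; // literal type: never inferred from silence
  givenAt: Date;
  withdrawnAt?: Date;             // withdrawal must be as easy as granting
}

// Withdrawal is a single operation, mirroring the single act of consenting.
function withdrawConsent(record: ConsentRecord, at: Date = new Date()): ConsentRecord {
  return { ...record, withdrawnAt: at };
}
```

One design point worth noting: modelling one purpose per record, rather than a list of purposes, makes blanket consent unrepresentable rather than merely discouraged.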
Every consent request or management tool must also be accessible in regional languages, ensuring that there is no digital divide based on language or ability.
Evolving Risks in Digital Consent Design
Deceptive patterns—often called dark patterns—are now embedded across many digital services, quietly shaping how users respond to consent prompts. These interface and journey-level design tactics steer individuals toward decisions they would not make independently, despite the law’s clear emphasis on autonomy and informed choice.
A common tactic is persistent nudging. Users who decline to share their phone number may face repeated prompts at every login, framed as essential for security even though email authentication is equally effective. This systematic pressure chips away at free choice and falls short of the DPDPA’s requirement that consent be voluntary.
Complex withdrawal journeys are another form of manipulation. Revoking permission for advertising-related data sharing often involves navigating a long sequence of nested menus and multiple confirmation steps. The resulting “consent fatigue” makes withdrawal disproportionately harder than giving consent—contradicting the Act’s mandate that both actions be equally simple.
Defaults also work quietly against the user. A food delivery app may preset birthday-sharing preferences to “share with everyone,” even when the data serves no core functional purpose. Such practices violate the privacy-by-default principle, which requires platforms to minimise data collection unless explicitly needed.
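Privacy by default has a direct expression in code: optional sharing starts in its most restrictive state, and only a deliberate user action loosens it. A minimal sketch, with hypothetical setting names chosen for illustration:

```typescript
// Hypothetical visibility settings for a profile field such as a birthday.
type Visibility = "private" | "friends" | "everyone";

interface SharingPreferences {
  birthdayVisibility: Visibility;
  locationSharing: boolean;
}

// Privacy by default: the initial state is the most restrictive one.
// Nothing is shared until the user takes an affirmative action to change it.
const defaultPreferences: SharingPreferences = {
  birthdayVisibility: "private",
  locationSharing: false,
};
```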
Emotional language is frequently used to influence decisions. Prompts such as “Do not miss out on personalised experiences your friends are enjoying!” or “Are you sure you want to limit your experience?” exploit social anxieties rather than provide factual information. This undermines the fairness and transparency that the DPDPA expects.
Some platforms rely on design camouflage. Key information—such as links to withdraw consent—may technically exist but appear in tiny fonts, low-contrast colours, or obscure screen locations, making them nearly impossible to notice during regular use.
Social proof is another tool in the playbook of deceptive patterns. Messages like “Join 50 million users who have enabled location sharing!” create a manufactured sense of urgency, even when the feature is unnecessary for core functionality. This subtly pressures users into sharing more than they intend.
Rights obstruction is perhaps the most direct violation. While consent can often be given with a single tap, withdrawing it may involve verification steps, waiting periods, or even a call to customer support.
Some apps include misleading “dead ends,” where privacy settings lead to generic help pages or options that only allow additional data sharing. These asymmetries contradict the DPDPA’s requirement for accessible, user-centric rights mechanisms.
Finally, vague statements such as “Your data might be used to improve our services” fail to give users meaningful visibility into how their information is processed. Under the DPDPA, such ambiguity prevents users from providing valid informed consent.
Embedding Privacy into Design Practice
Creating DPDPA-compliant interfaces requires a fundamental shift from manipulation to empowerment, from exploitation to transparency. The solution lies not in legal compliance as an afterthought, but in privacy by design as a core principle. In interface terms, that starts with fundamentals: consistent typography, adequate contrast ratios, and a logical information architecture.
Critical privacy information should be as visually prominent as service features. Legal jargon must give way to plain language—every statement about data processing should be understandable without specialised knowledge.
User interface and user experience (UI/UX) designers should use consistent button styling for consent and refusal options, ensure adequate colour contrast, and design with accessibility in mind for users with disabilities.
The content structure should clearly outline the specific purpose of processing, identify the data being processed, explain user rights and how to exercise them, and provide contact information for any data protection questions.
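One way to keep a notice complete is to model its required contents as a type, so a consent screen cannot be built without them. This is an illustrative sketch under that assumption; the field names are hypothetical, not terms from the Act.

```typescript
// Each field corresponds to an element the notice paragraph above calls for.
interface ConsentNotice {
  purpose: string;           // the specific purpose of processing
  dataItems: string[];       // the personal data being processed
  rightsSummary: string;     // user rights and how to exercise them
  rightsExerciseUrl: string; // where those rights can be exercised
  dpoContact: string;        // contact for data protection questions
}
```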
Interactive elements must use clear, action-oriented button labels, such as “I consent to email marketing,” rather than vague “Accept” buttons. Interfaces should implement equal-effort consent and withdrawal mechanisms, provide immediate confirmation of consent choices, and allow users to review and modify their consent at any time.
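The equal-effort requirement can be kept honest by routing grant and withdrawal through the same code path, so neither can quietly accrete extra steps. A hedged sketch, assuming a web front end and a hypothetical /consent endpoint:

```typescript
// Grant and withdrawal share one function, so both take exactly one call
// and both return an immediate confirmation for the UI to display.
// The endpoint and payload shape are assumptions for illustration.
async function setConsent(
  purpose: string,
  granted: boolean
): Promise<{ confirmedAt: string }> {
  const response = await fetch("/consent", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ purpose, granted }),
  });
  if (!response.ok) {
    throw new Error(`Consent update failed: ${response.status}`);
  }
  return response.json();
}

// Usage: granting and withdrawing are perfectly symmetric.
// await setConsent("email-marketing", true);
// await setConsent("email-marketing", false);
```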
Cross-platform consistency is also crucial: the consent management experience should be uniform across desktop, mobile web, and native applications, using consistent visual elements, terminology, and interaction patterns to prevent the confusion that could compromise informed consent.
Strengthening User Trust in the DPDPA Era
The DPDPA represents more than regulatory compliance—it embodies a fundamental shift toward respecting individual autonomy in the digital age. As data fiduciaries grapple with implementing these requirements, the temptation to rely on deceptive practices may seem attractive to maintain user engagement and data collection.
However, the long-term success of digital services increasingly depends on user trust and genuine consent. Organisations that invest in transparent, user-empowering interfaces position themselves as trustworthy stewards of personal data and build sustainable competitive advantage through ethical design practices.
The convergence of legal requirements, user expectations, and business imperatives creates an opportunity to redefine the relationship between technology and privacy.
By rejecting deceptive practices and embracing transparent design, organisations can create digital experiences that serve users’ interests while building stronger, more sustainable business models grounded in genuine user choice and trust.
As India’s digital economy continues its explosive growth, the choice facing businesses is not whether they can afford to implement these principles, but whether they can afford not to. In an increasingly privacy-conscious world, the organisations that respect user autonomy will be the ones that users trust with their data—and their business.
The author is a Partner at Saraf and Partners.