India’s new data law misses critical protections for users of assistive technologies, raising concerns over consent and data minimisation
Assistive technology (AT) is integral to the health sector, with over 2.5 billion people worldwide relying on some form of it. Demographic shifts and the ongoing epidemiological transition are projected to push the number of people needing AT to 3.5 billion by 2050. ATs have several benefits, including enhancing the quality of life, improving healthcare access, enabling early detection of health issues, and fostering patient and caregiver engagement. Nevertheless, such technology raises several ethical concerns, including user privacy, surveillance, confidentiality, and the management of large volumes of data.
This piece discusses the various types of ATs and the data-intensive nature of such technologies. It examines how the Digital Personal Data Protection Act, 2023 (DPDPA), and the draft DPDP Rules 2025, fall short in addressing these concerns, particularly around data minimisation and meaningful user control. The article also provides recommendations for policy measures to help tackle this grey area.
ATs are technologies that leverage computing capabilities, robotics, and machine intelligence for assistive purposes. These include wearable and non-wearable devices, smart devices, tablets, telephones, computers, cameras, robots, and voice-activated technology. ATs can be utilised to monitor activity, sleep, glucose, blood pressure, heart rate, falls, frailty, cognitive function, and medication or treatment adherence; they also facilitate service deliveries, such as remote provider visits, education, reminders, and health information sharing. Figure 1 illustrates the various types of ATs.
Figure 1: Examples of ATs
| Category | Example | Purpose |
| --- | --- | --- |
| Hearing Aids | Hearing aids, cochlear implants | Enhances or restores hearing for individuals with hearing disabilities |
| Mobility Aids | Wheelchairs, walkers, canes | Helps individuals with physical disabilities move independently |
| Vision Aids | Screen readers, Braille displays | Assists blind or visually impaired users in accessing digital and text content |
| Communication Aids | Speech-generating devices, augmentative/alternative communication (AAC) apps | Supports individuals with speech impairments in communication |
| Cognitive Aids | Reminder apps, text-to-speech tools | Helps people with learning disabilities or memory challenges |
| Prosthetics | Artificial limbs, exoskeletons | Replaces or augments missing or non-functional body parts |
| Adaptive Software | Voice recognition, specialised keyboards | Allows users with motor disabilities to operate computers and devices |
Source: Author’s Own.
ATs can play an integral role in increasing accessibility not just for the chronically ill or persons with disabilities, but also for anyone facing temporary impairments or ageing-related challenges. ATs help individuals become more independent by providing life-altering functionality solutions. By bridging accessibility gaps, these tools promote independence, productivity, and inclusion in education, work, and daily life.
The Digital Personal Data Protection (DPDP) Act, 2023 is based on the principle of data minimisation, following the footsteps of the European Union’s General Data Protection Regulation (EU-GDPR). Under Section 6 of the DPDP Act, this principle requires that the collection, storage, processing and retention of personal data be limited strictly to what is relevant and necessary for a specific purpose. The idea is simple: if the data isn’t essential for delivering a service, it should not be gathered in the first place.
Applying this principle in the context of ATs, such devices should ideally limit the collection of sensitive health, location, or behavioural data unless it is strictly required for the functioning of the device or the service. While this principle has certainly found its way into the law, its implementation remains a challenge.
In fact, one of the biggest challenges in implementing the data minimisation principle for ATs is determining what data is actually necessary. ATs often collect a wide range of personal and sensitive data, from biometric inputs and Global Positioning System (GPS) locations to behavioural patterns and emotional responses, without offering clear justification. For example, a speech-to-text tool may need voice data to function, but does it need to store entire conversations? A navigation aid may require real-time location data, but why should it retain travel history? These blurred lines make it difficult to hold developers accountable. Such excessive data collection raises serious privacy concerns, especially for vulnerable users who depend on these technologies daily.
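The distinction drawn above, between data a feature needs and data a developer merely wants, can be made concrete in code. The sketch below shows one way a developer could enforce purpose limitation at the point of collection; the purposes, field names, and allow-list are purely illustrative assumptions, not drawn from any real product or mandated by the DPDP Act.

```python
# Hypothetical sketch: purpose-limited collection for an AT device.
# The purposes and field names below are illustrative assumptions only.

ALLOWED_FIELDS = {
    "speech_to_text": {"audio_snippet"},  # needed to transcribe the utterance
    "navigation": {"current_location"},   # needed to compute a route
}

def collect(purpose: str, incoming: dict) -> dict:
    """Keep only the fields strictly necessary for the declared purpose."""
    allowed = ALLOWED_FIELDS.get(purpose, set())
    return {k: v for k, v in incoming.items() if k in allowed}

# A speech-to-text request that arrives carrying excess data:
raw = {
    "audio_snippet": b"...",
    "full_conversation_log": "...",   # not needed to transcribe
    "current_location": (28.6, 77.2), # not needed to transcribe
}
minimal = collect("speech_to_text", raw)
# Only the audio snippet survives; the conversation log and location are dropped.
```

The point of the sketch is that "necessary data" becomes an auditable allow-list rather than an after-the-fact justification: anything not mapped to a declared purpose is never ingested at all.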
This failure to uphold the principle of data minimisation is closely tied to how consent is currently designed and implemented in most ATs. Instead of offering users meaningful control over what data they share and for what purpose, ATs typically operate on a ‘take-it-or-leave-it’ basis. There are no options to opt in or out of specific data types selectively, nor are there clear explanations of what each data stream is used for.
This approach directly contradicts the intent of the DPDP Act, which places user empowerment at its core. Users, particularly those with disabilities who depend on these technologies for mobility, communication, or independence, would effectively be coerced into giving up their data, because opting out would mean losing access to critical support systems.
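What selective, revocable consent could look like in practice is easy to sketch. The model below, a hypothetical illustration rather than anything prescribed by the DPDP Act or any real AT product, separates the streams essential to the device's core function from optional streams that each require an explicit opt-in and can be withdrawn without disabling the device.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of granular, revocable consent per data stream.
# Stream names are illustrative assumptions only.

@dataclass
class ConsentLedger:
    granted: set = field(default_factory=set)

    def opt_in(self, stream: str) -> None:
        self.granted.add(stream)

    def opt_out(self, stream: str) -> None:
        self.granted.discard(stream)

    def may_process(self, stream: str, essential: set) -> bool:
        # Essential streams keep the device functional; everything
        # else needs an explicit, still-valid opt-in.
        return stream in essential or stream in self.granted

ESSENTIAL = {"real_time_location"}  # required for navigation itself

ledger = ConsentLedger()
ledger.opt_in("travel_history")
ledger.opt_out("travel_history")    # user later withdraws consent

assert ledger.may_process("real_time_location", ESSENTIAL)      # core function intact
assert not ledger.may_process("travel_history", ESSENTIAL)      # optional stream blocked
```

The design choice worth noting is that opting out of `travel_history` never touches `real_time_location`: withdrawing consent for a non-essential stream cannot cost the user the device's core function, which is precisely what the take-it-or-leave-it model fails to guarantee.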
Thus, while the DPDP Act sets a progressive tone, it fails to resolve this grey area in practice. The Act does not explicitly prohibit such an approach, nor do the current draft DPDP Rules (2025) provide sufficient clarity on what constitutes valid, user-centric consent. The lack of regulatory enforcement or guidance around selective data sharing, opt-out rights, or purpose limitation in consent forms means that companies can continue to operate without accountability even when dealing with sensitive health and disability-related data.
While the DPDP Act, 2023 establishes progressive principles like data minimisation and purpose limitation, the recently released draft DPDP Rules, 2025, expose several regulatory gaps, especially in the context of ATs. The Rules fall short in operationalising the Act's intent, particularly in addressing the technology-specific risks that data-intensive tools like ATs pose. For instance, there is no explicit mention of consent customisation, opt-out mechanisms, or safeguards for sensitive health or disability-related data.
Without strong regulatory oversight and clearer implementation frameworks, the Rules risk leaving individuals, especially those with disabilities, at the mercy of private AT developers, with little accountability for data misuse or overreach.
The increasing use of ATs raises a pressing question: how will excessive data collection be tackled when current laws and policies do not effectively address it? It is important to determine what constitutes 'necessary data' under the DPDP Act, and the draft Rules need to clarify consent and purpose limitation further. Until this gap is addressed through clear rulemaking or amendments, the DPDP Act's promise to protect and empower users remains compromised. For individuals relying on ATs, privacy remains a luxury, not a right. To truly align with the Act's intent, ATs must embed privacy-respecting practices such as clear data-use justifications and the right to opt out of non-essential data collection without losing access to the device's core functions.
Solutions such as Privacy Impact Assessments (PIAs) and Wearable Impact Assessments (WIAs) can be made mandatory for all AT developers and providers. These assessments must be conducted in consultation with disability rights organisations and AT users, and should result in publicly available summaries in accessible formats (audio, braille, simplified text) to ensure transparency. By embedding these assessment processes into India's AT ecosystem, alongside targeted amendments to existing law, we can create a continuous feedback loop between users, developers, and regulators, ensuring privacy protections evolve alongside technological innovation.
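Even a lightweight PIA can be made machine-checkable. The sketch below, a purely hypothetical record structure (no such schema exists in the DPDP Act, the draft Rules, or any PIA standard), shows how an assessment could flag data streams collected without a stated necessity, the exact gap the minimisation discussion above identifies.

```python
# Hypothetical PIA record for an AT product; all fields and values
# are illustrative assumptions, not mandated by any law or standard.

pia = {
    "product": "example screen reader",
    "data_streams": ["voice_audio", "usage_patterns"],
    "necessity_justification": {
        "voice_audio": "required for transcription",
        "usage_patterns": None,  # collected, but no stated necessity
    },
    "consulted": ["disability rights organisation", "AT users"],
    "public_summary_formats": ["audio", "braille", "simplified text"],
}

def unjustified_streams(record: dict) -> list:
    """Flag streams collected without a stated necessity - candidates
    for removal before the product ships."""
    return [
        stream
        for stream, reason in record["necessity_justification"].items()
        if reason is None
    ]

print(unjustified_streams(pia))
```

Here `usage_patterns` would be flagged for review, turning the PIA from a one-off compliance document into a recurring check that feeds the user-developer-regulator loop described above.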
Tanusha Tyagi is a Research Assistant with the Centre for Digital Societies at the Observer Research Foundation
The views expressed above belong to the author(s).