Expert Speak Digital Frontiers
Published on Oct 03, 2020
Pooling data rights to negotiate better with platform companies can help rebalance power in the data economy and address the growing gaps in individual data protections.
Data stewardship: Collective bargaining for privacy

The ongoing pandemic has made it clear that the future is app-driven, and it has accelerated the adoption and use of technology. Tech-mediated remote work and learning, tele-health services, and doorstep deliveries have become ubiquitous. In exchange for these services, consumers consent to part with critical information such as their location and phone numbers. This online experience atomises users, and negotiating with technology companies through frameworks of individual data rights, as imagined in the Personal Data Protection Bill (PDPB), is insufficient.

There is a need to reframe the conversation on privacy as a question of collective rights, and to develop tools that allow people to pool their data rights, along with mechanisms and governance frameworks that enable collective bargaining on issues of privacy. This is critical for several reasons.

First, while individuals part with personal information, applications also develop intelligence or inferences based on exhibited preferences and interactions with others on those applications. This data is not just about one person, but about the community they occupy. Privacy-related harms in the form of discrimination or misrepresentation therefore impact a broader group of people and can be understood as “community harm”. Scholars regard privacy as a public good, since the decisions of one individual can endanger another, and suggest the need for group coordination on privacy-related matters.

Second, individuals are unable to bargain with technology companies or the state about their data rights. Individual consent is burdensome, and it is almost impossible to make informed, meaningful choices. Because data value chains are complex, and there is little transparency about how data is collected and used beyond what is visible, safeguarding privacy is a tall task.
People can take small actions, but it is unreasonable to expect individuals to navigate the labyrinth of privacy policies, cookie notifications and the like to protect themselves.

Third, there is a growing sense that if data is indeed an asset that technology companies can use to generate profits, then communities can also leverage it to claim greater economic rights over the data they generate collectively. Communities must be able to use their data to create value for themselves and for society at large.

Despite these issues, the current frameworks of data protection and privacy focus on individual data rights and do not take into account the growing need for more collective protections. One reason is that Western liberal ideas of personhood, on which current imaginations of privacy are based, do not recognise collective identities or the role of social relations, culture and environment in defining the self, and they ignore concepts of relational and group privacy. Recent writing on the sub-Saharan philosophy of Ubuntu aims to reconceptualise personal data and reimagine privacy through a more communitarian perspective. While this is a valid critique, groups or communities of data are fluid and ever-changing, and the privacy and data rights of one group may overlap or come into conflict with those of another. The only constant in the collective is the individual, and therefore the privacy of a group is the sum of the data rights of those constituting it.

Pooling data rights to negotiate better with platform companies can help rebalance power in the data economy and address the growing gaps in individual data protections. This can be done through data stewardship: a set of governance and technological principles that can unlock the value of data while safeguarding the rights of individuals and communities.
These data stewards can take several forms and exist for multiple purposes – trusts (where a board of trustees governs data on behalf of a group), cooperatives (where data is owned and managed by members), unions (where data decisions are made by representatives of the group) – but all fundamentally reorganise the way data rights are exercised and serve to collectivise bargaining with platforms. Stewards are envisioned as purpose-led representatives of the community, with a fiduciary responsibility towards the people whose interests they put forth. In an ideal scenario, the governance structures of a steward would be designed by the communities it represents. For example, drivers on ride-hailing platforms can design and be part of a steward, which in turn can bargain with the company over how their personal and collective data is used and shared. The steward enables drivers to build a coalition, and actively creates opportunities for redressal on data issues such as overcollection, unconsented data brokerage and workplace surveillance. Stewards can thus help actualise community data rights and give people more say in how their data is governed.

The recent Non-Personal Data (NPD) Committee report is the first policy document to highlight the need for community data rights. The report suggests that these collective data rights will be exercised through an “appropriate data trustee”, defined as “the closest and most appropriate representative body for that community, which will, in many cases, be an appropriate community body or Central/ State/ Local government agency”. While the suggestion of governments as data trustees is deeply problematic, the spirit of the report is correct: communities must exercise rights over their data, and do so through a representative, a steward.
However, the committee’s suggestions need to be tested, to understand how stewards can be built so that communities are able to come together on data issues, devise methods to exert greater influence on platforms, safeguard their interests, and protect privacy.

In these times of rapid digitisation, power is concentrated in the hands of technology companies that isolate us in our online experiences and make it impossible for people to protect their privacy in any fundamental way. Given the evolution of how data is organised and used, it is increasingly becoming a group resource, one that must be governed in a consultative and collaborative manner so that collectives can present and negotiate their interests more effectively. The pandemic is an opportunity to re-examine and reorder existing systems, and to create new ones that better reflect the personal and collective needs of people.
The views expressed above belong to the author(s).