This article is part of the series — Colaba Edit.
The digital era and the accelerated growth of technological capabilities have changed how we live, work and play. These transformations have enabled greater interconnectedness and made it easier for individuals to access services and for businesses to expand into new markets. At the same time, governments must adapt regulation in an agile fashion to ensure that public goods are protected even as countries pursue economic growth and international collaboration.
Some issues that regulation now needs to contend with are the lack of transparency around data collection and usage, ambiguity about the implications of disruptive technologies like artificial intelligence, and the balance between public good and digital value creation. Governments need to be increasingly proactive in addressing these issues for their citizens, while the private sector and other actors share the responsibility of ensuring that businesses do not infringe on individuals’ privacy and security. Two regulatory measures offer broader lessons for all: the data trust model, a bottom-up approach to governing data ownership and access, and Canada’s Digital Charter, a top-down, government-led initiative to ensure data privacy and security.
Data trusts
Data trusts provide a new framework to address long-debated issues of data governance and access. A data trust entails authorising an independent third party (i.e., neither government nor private industry) to manage the collection and use of data. Data trusts can encourage both data interoperability and the ethical, compliant governance of data. The former helps businesses unlock value from the myriad data sources available to us today, while the latter protects the owners and contributors of data sources and entitles them to appropriate attribution and decision-making power over the use of their data. We have already seen the consequences of private companies and governments wielding power over users’ data in the absence of a data trust mechanism: WhatsApp’s attempt to change its privacy policy, implying broader data sharing with Facebook, and the Singapore government’s decision to allow local law enforcement access to the country’s COVID-19 contact tracing data. In both cases, individuals who were using these services and had their data collected may have agreed to the original terms and conditions, but they had no chance to assess or approve the changes in how their data would be used when the organisations decided to expand their usage of consumers’ and citizens’ data. A data trust governance structure applied to these cases would have allowed for increased transparency and a fairer decision-making process, giving individuals clear options and mechanisms to opt out in advance of such large-scale changes.
Since the introduction of the concept in 2018 by the Open Data Institute, which defined a data trust as “a legal structure that provides independent stewardship of data,” organisations around the world have tried various data trust models to help data owners retain control over their data while still allowing businesses and governments to build solutions using that data. While the core purpose remains the same — to effectively manage data from a privacy standpoint and ensure data is not monopolised — data trusts vary in their degree of flexibility in data sharing, their funding models, and how their rules on protecting data rights and privacy are enforced. Data trusts will continue to evolve. Ideally, they will protect people from harm while maximising the value of data sharing and use, by providing a proactive way to define the terms and conditions of data sharing rather than leaving users with no choice but to comply with ever-changing policies set by more powerful stakeholders.
Canada’s Digital Charter
While data trusts and other novel governance models continue to develop, legacy systems and regulations still need to be updated to prepare for future advancements in digital technologies. In 2019, the Canadian government launched the Digital Charter to provide ground rules for businesses and policymakers. The charter is a direct outcome of a series of national digital and data consultations conducted in 2018 to better understand the digital and data needs of Canadian residents. It is based on ten core principles that reflect the needs of Canadians in the digital economy. The principles broadly aim to enhance the accountability of both the private and public sectors in using consumers’ data, give Canadians more control over their own information, and identify areas where the public sector needs to build public trust and adopt digital and data practices that enable open innovation.
The Digital Charter provides a good foundation that not only addresses citizens’ privacy concerns but also highlights the ways in which the government itself needs to modernise and become more digitally accessible. However, it is not a legal document; there is still a long way to go, through revisions to current legislation and regulation, to ensure that businesses abide by these new principles. Once adopted, the Digital Charter Implementation Act (Bill C-11) will bring significant changes to the federal regulatory framework for the protection of personal information and will govern businesses’ handling of such information in provinces that have not enacted similar legislation of their own (currently, Quebec, Alberta and British Columbia have such laws). The bill proposes granting new rights to individuals while imposing requirements on organisations, particularly to achieve greater transparency. Some provinces also have consultations underway to develop targeted strategies for their communities, such as Ontario’s digital and data strategy.
Additional proposals are also being discussed to modernise Canada’s Personal Information Protection and Electronic Documents Act, which currently defines personal data as information about an identifiable individual. However, there is increasing evidence that individuals can be identified by combining anonymised datasets or de-identified data points that might otherwise be considered non-identifiable personal information. Among other ideas, a risk-based approach could address both privacy concerns and the need to enable innovation: the use of de-identified information would be permitted, with increased transparency, in certain specified circumstances, with penalties for re-identifying data for any other use. Further consultations and lessons from peers around the world would help improve the regulation of otherwise de-identified or non-personal data.
The data trust model and measures like Canada’s Digital Charter only partially resolve the issues presented by innovation and new technologies that rely on large-scale data collection and use. It will take increased accountability from public and private sector leaders, and the direct engagement of citizens, to ensure equitable participation in the digital era.
The views expressed above belong to the author(s).