The emergence of accessible connectivity has had vast repercussions in the public sphere. It allowed users to engage in the digital space for various needs, such as education, work, health, e-commerce, and communication. Digitalisation and data collection became very useful in all sectors, including for humanitarian aid providers. However, applying the principle of “Do No Harm” while engaging in the digital sphere remains a challenging topic.
The COVID-19 pandemic revealed that the internet is a vital necessity, along with concomitant topics like digital security and privacy. According to the International Telecommunication Union (ITU), “In the first year of the pandemic, the number of Internet users grew by 10.2 per cent, the most significant increase in a decade, driven by developing countries where Internet use went up 13.3 per cent. In 2021, growth returned to a more modest 5.8 per cent, in line with pre-crisis rates.”
In this regard, we need to remember that apart from conventional digital needs and marketing data, there is humanitarian private data that humanitarian aid providers collect. A wide range of humanitarian services is provided on the basis of personal information, which includes records of services received, family details, IDs, passports, photographs, biometric data, health data, and interview replies. All this personal data housed in the digital sphere poses certain risks.
For instance, in January 2022, the International Committee of the Red Cross (ICRC) was targeted by a sophisticated threat actor, and the data of more than 515,000 people worldwide was compromised. In its statement, ICRC-Geneva acknowledged that the leaked data related to the Restoring Family Links programme, which had been the target of cyber attackers, causing severe disruption to the operations of the ICRC and 60 National Red Cross and Red Crescent Societies. Importantly, this programme responds to one of the most basic humanitarian needs: restoring contact between families separated by armed conflict, violence, migration, and other forms of displacement.
On 30 August 2019, the United Nations’ (UN) Geneva offices were hacked. According to a confidential UN report obtained by The New Humanitarian, dozens of UN servers, including systems at its human rights offices and its human resources department, were compromised, and some administrator accounts were breached.
The occurrence of cyberattacks at well-established organisations proves that no one is secure in the digital space and that we need to follow a ‘zero trust’ policy when it comes to personal data. One can only guess at how many Non-Governmental Organisations (NGOs) have been victims of cyberattacks, especially in unstable political territories where digital skills and literacy are at a low level. According to a survey by CohnReznick, “More than two-thirds of non-profits failed to assess their levels of cybersecurity risk.”
Lack of skills and awareness: the reason behind lax safety measures
The ITU figures show that the world is still struggling to equip people with basic Information and Communication Technology (ICT) skills. There are still countries whose capabilities have not been assessed due to unstable political situations.
The non-profit sector depends on ICT, which includes telecoms, licensed software, devices, and tech platforms, but the proper measures to secure this access are not always in place. Securing it is a long struggle, sometimes even a confrontation, owing to bureaucracy and resistance to accepting new ways of working. Moreover, it is also a matter of the donor organisation’s requirements for the data collected and the project implementer’s neglect of essential digital safety preparedness.
This resistance is mainly due to the novelty of digital skills and digital rights concepts; because the sense of an actual physical threat is not immediate, people do not seek essential safety measures. The gravity of the situation only strikes when people and systems fall prey to cyberattacks, scams, phishing, and digital identity theft.
The humanitarian sector needs to protect its virtual information. It needs to develop solid strategies and mechanisms that enable people to use cyberspace as safely as the physical world, with safeguards in place before damage happens.
Personal data collection and processing occur at all stages of the humanitarian response mechanism: to increase situational awareness, conduct analysis, support ongoing operations, and secure the delivery of assistance. However, in the countries where this data is collected, data protection mechanisms usually either do not exist, or those that do exist are inadequate and do not emphasise the individual’s consent to protect private data.
The United Nations Conference on Trade and Development (UNCTAD) reports that about 137 out of 194 countries have implemented legislation to secure data protection and privacy. Africa and Asia show different levels of adoption, with 61 and 57 per cent of countries having adopted such legislation, respectively. Only 48 per cent of least developed countries have passed such legislation. Unfortunately, even laws with proper legal and digital rights provisions cannot ensure safety in territories outside their jurisdiction. For example, the European Union’s General Data Protection Regulation (GDPR) sets forth rules for processing personal data and defines digital rights; however, it cannot enforce its provisions in many territories where humanitarian aid is distributed.
It may seem an arduous task, but it is possible to mitigate risks and secure sensitive data through constant education of the stakeholders and a systematic Data Protection Impact Assessment (DPIA).
Stakeholder education needs to start with digital safety training and basic security precautions, with donor organisations supporting additional opportunities for staff to update their knowledge, access an up-to-date knowledge base, and conduct due diligence of third-party service providers.
The next step is to conduct a DPIA, which helps to map data collection during project activities, form policies for personal data protection, analyse risks to the rights of data subjects, classify the data, diminish damage after cyber incidents, and provide additional safeguards to protect data.
For example, the United Nations High Commissioner for Refugees (UNHCR) requests monthly access to the list of all residents in a refugee camp, maintained by the Camp Coordination and Camp Management (CCCM), to examine their eligibility for cash assistance. The DPIA would examine the necessity of sharing, the risks of regular data transfers, the categories of data to be shared, and other potential risks to data rights. Since refugee camps are at risk of intrusion and may contain both hard copies and digital versions of documents, the DPIA would explore how this data is processed and stored, including the digital service provider’s safety measures on the integrity and availability of the data uploaded.
A DPIA enables the identification of privacy risks to individuals, supports the development of risk mitigation strategies, and protects the actors’ reputations. It covers a description of the activities; the types of data and retention methods; the length of storage; and the devices and controls used. In some cases, it can even support due diligence. The French Data Protection Authority (CNIL) has provided three guides on Privacy Impact Assessment, which can be a starting point for data risk assessment.
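To make the DPIA elements above concrete, the sketch below shows how an organisation might keep a simple machine-readable register of processing activities and flag entries that need closer review. This is purely illustrative: the field names, sensitive-data categories, and thresholds are hypothetical assumptions, not drawn from the CNIL guides or any actual humanitarian data policy.

```python
from dataclasses import dataclass, field

@dataclass
class DPIAEntry:
    """One processing activity in a hypothetical DPIA register."""
    activity: str            # description of the processing activity
    data_categories: list    # e.g. ["IDs", "biometrics", "family details"]
    retention_months: int    # length of storage
    storage: str             # medium and controls used
    shared_with: list = field(default_factory=list)  # regular recipients

    # Illustrative set of categories treated as sensitive (an assumption).
    SENSITIVE = {"biometrics", "health", "ids", "interview replies"}

    def risk_flags(self):
        """Return simple warnings a reviewer should examine further."""
        flags = []
        if self.SENSITIVE & {c.lower() for c in self.data_categories}:
            flags.append("sensitive data collected: extra safeguards needed")
        if self.retention_months > 24:  # arbitrary threshold for illustration
            flags.append("long retention: justify necessity")
        if self.shared_with:
            flags.append("regular transfers: assess recipient safeguards")
        return flags

# Example modelled loosely on the camp-list scenario described above.
entry = DPIAEntry(
    activity="Monthly camp resident list for cash-assistance eligibility",
    data_categories=["IDs", "family details", "biometrics"],
    retention_months=36,
    storage="Access-controlled cloud database",
    shared_with=["UNHCR"],
)
for flag in entry.risk_flags():
    print(flag)
```

A real register would of course be far richer (legal basis, consent records, data subject rights procedures), but even a minimal structure like this makes risks visible and reviewable rather than implicit.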
In conclusion, it is important to commit to the “Do No Harm” pledge before initiating data collection. This requires consideration of the security and political context, the sensitivity of questions asked of individuals, the ability to conduct quality control in the field, the humanitarian imperative, possible incident disclosures, and digital literacy training.
Rights holders should not be perceived as part of a business or targeted as a marketing focal point. Moreover, partnerships between the humanitarian sector and the private sector need to be based on mutual trust and accountability. The data subject's fundamental human rights and freedoms must be prioritised over the organisation’s profit and annual Key Performance Indicators.
The views expressed above belong to the author(s).