One of the core principles of humanitarian action is to Do No Harm (DNH). In the context of humanitarian data, this principle can be applied in two ways: with an exclusive focus on minimising risk, or by balancing risk against utility. The risk-minimisation account is intuitively appealing: if humanitarian action should cause no harm, it seems to follow that humanitarian data management should minimise the risk of harm as well. In practice, however, this approach can be misguided.
Almost any data collection and processing in humanitarian action carries some risk of harm in the event of a data breach, so fully embracing the DNH principle would mean collecting no data at all. Yet not collecting data in humanitarian settings brings its own harms. Effective use of data can improve the speed of deployment and the accuracy of targeting, and in some cases can even allow for proactive work, preventing potential disasters before they emerge. These issues were discussed extensively at a recent event on data responsibility in humanitarian action at Wilton Park on 20-22 May, hosted by the OCHA Centre for Humanitarian Data.
The trade-off between risk and utility is particularly salient in two aspects of humanitarian data management: data collection and data sharing. To make an informed trade-off, both sides need to be quantified; risks, for example, need to be considered in terms of their severity and their probability of occurring. Since the implementation of the GDPR in Europe, most organisations have put practices in place not to share personally identifiable data, and many have started to consider the risk of re-identifying respondents by combining multiple data sources. An example of the latter would be a survey on religious belief that includes no personally identifiable information but does record respondents' IP addresses. If other public data sets link IP addresses to names, the two sources could be combined to learn specific individuals' religious beliefs, as sketched below. However, a conceptualisation of risk that focuses only on identification and re-identification is too narrow. It is better to think of the riskiness of data as the extent to which it can be used to enable or support illegitimate interventions.
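To make the re-identification risk concrete, here is a minimal sketch of a linkage attack. It assumes two hypothetical datasets: a survey that stores respondents' IP addresses alongside a sensitive attribute, and a separate public dataset that maps IP addresses to names. All column names and values are illustrative, not drawn from any real response.

```python
# Illustrative linkage (re-identification) attack: joining two datasets
# on a shared quasi-identifier. Data and column names are hypothetical.
import pandas as pd

survey = pd.DataFrame({
    "ip_address": ["203.0.113.5", "198.51.100.7"],
    "religion": ["A", "B"],  # sensitive attribute; no names were collected
})

public_leak = pd.DataFrame({
    "ip_address": ["203.0.113.5", "198.51.100.7"],
    "name": ["Respondent X", "Respondent Y"],
})

# A simple join on the shared IP address re-attaches names to the
# sensitive attribute, even though the survey itself held no names.
reidentified = survey.merge(public_leak, on="ip_address")
print(reidentified[["name", "religion"]])
```

The point of the example is that the survey on its own looks safe; the risk only appears when it is combined with an external source, which is why risk assessments need to consider the wider data environment rather than single datasets in isolation.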
“To mitigate risks relating to data sharing, good protocols for information exchange need to be in place.”
-Tomas Folke, Chief Statistician, Ground Truth Solutions
The principle of collecting only data that is essential is known as data minimisation. It is a key principle of data security: the best way to ensure that data isn't stolen or misused is not to collect it in the first place. Donors can play a key role in supporting data minimisation efforts. Donor audits often require collecting and retaining detailed data about affected people, which creates risks for them, so the need to evaluate the relief effort effectively must be weighed against the risks that data collection and storage entail. Harmonising donor requirements around a few key indicators would not only ease the reporting burden on operational agencies; it would also reduce the avenues of attack available to bad actors.
To mitigate risks relating to data sharing, good protocols for information exchange need to be in place. Currently these exist bilaterally between certain organisations, but to maximise impact they should apply multilaterally, across an entire sector or an entire response. Another way to improve inter-agency data sharing is to use contemporary cryptographic techniques, which allow data to be used without giving up data governance: one organisation can run analyses on another organisation's data and receive aggregate outputs without ever accessing the underlying records directly.
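One family of techniques that makes this possible is secure aggregation. The toy sketch below uses additive secret sharing in plain Python to show the core idea; it is not the specific tooling discussed at the event, and a real deployment would rely on audited multi-party computation or homomorphic-encryption libraries rather than code like this. The organisation names and counts are hypothetical.

```python
# Toy illustration: an analyst obtains an aggregate total without any
# party ever seeing another organisation's raw count.
import random

PRIME = 2**61 - 1  # all arithmetic is done modulo a large prime

def split_into_shares(value: int, n_parties: int) -> list[int]:
    """Split one organisation's value into n random-looking shares."""
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    last = (value - sum(shares)) % PRIME
    return shares + [last]

# Each organisation holds a private count it does not want to disclose.
private_counts = {"org_a": 1200, "org_b": 875, "org_c": 310}

# Every organisation splits its count into shares, one per computing party.
n_parties = 3
all_shares = [split_into_shares(v, n_parties) for v in private_counts.values()]

# Each computing party sees only one share per organisation and sums them.
partial_sums = [
    sum(org_shares[p] for org_shares in all_shares) % PRIME
    for p in range(n_parties)
]

# Combining the partial sums reveals only the aggregate, not any input.
total = sum(partial_sums) % PRIME
print(total)  # 2385, without any party learning an individual count
```

Each individual share is statistically uninformative on its own; only the final combination of partial sums reveals anything, and what it reveals is the aggregate, which is exactly the "aggregate outputs without direct data access" arrangement described above.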
There are a number of other data-management practices that can reduce the risk of data falling into the wrong hands, such as ensuring that all computers in the field are password protected and have firewalls and up-to-date antivirus software, operating systems and browsers. In addition, the data files themselves should be encrypted. Open-source tools exist for all of these tasks, so adopting them is often a matter of capacity within organisations rather than of funding. Good practices can minimise breaches, but the sad reality is that most humanitarian actors will struggle to defend themselves against a determined and competent attack, so strong procedures also need to be in place to deal with breaches if they happen.
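As an example of how little code file encryption requires, here is a minimal sketch using the open-source Python `cryptography` package. The file names are illustrative, and key management (where the key lives and who can read it) is the hard part in practice and is not addressed here.

```python
# Minimal sketch of encrypting a data file at rest with the open-source
# `cryptography` package (pip install cryptography). File names are
# illustrative; key storage and access control are out of scope here.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this securely, e.g. in a key vault
fernet = Fernet(key)

with open("survey_responses.csv", "rb") as f:
    plaintext = f.read()

# Write only the encrypted version to disk; the plaintext copy should
# then be securely deleted.
with open("survey_responses.csv.enc", "wb") as f:
    f.write(fernet.encrypt(plaintext))

# Later, an authorised user holding the key can recover the original data.
with open("survey_responses.csv.enc", "rb") as f:
    recovered = fernet.decrypt(f.read())
```

Encrypting files at rest means that even if a laptop is lost or a server is breached, the attacker still needs the key to read the data, which buys time for the breach procedures mentioned above.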
One thing that became clear during the event is that high-level guidance on data responsibility is already available (see, for example, the OCHA Data Responsibility Guidelines, the ICRC's handbook on data protection and ALNAP's handbook). The task now is to translate these guidelines into concrete best practices for different humanitarian contexts, and to set up feedback loops to evaluate those practices so that they can be improved over time.
This post was written by Tomas Folke, Chief Statistician, Ground Truth Solutions.