Massimo Marelli: 'only trusted technology can be successful technology'
The start of a new year is the perfect time to reflect on what has been and to consider the challenges ahead. It is also a moment for practitioners to meet, exchange, and try to find local and global solutions to today’s digital dilemmas. The DigitHarium, launched by the International Committee of the Red Cross (ICRC) in partnership with a number of other humanitarian organisations, data protection authorities, academia, and civil society, plays precisely this role: monthly dialogues and debates will explore the implications of digital technologies for humanitarian action.
This week, the OHCHR Special Rapporteur on the right to privacy, Professor Joseph A. Cannataci, was invited to share his experience in the first “digital dilemmas” dialogue, on data protection in a Covid world.
Massimo Marelli, head of the ICRC’s Data Protection Office, who co-hosted the event with Stuart Campo of the OCHA Centre for Humanitarian Data, helps us understand the issues at stake and the role the organisation can play in bringing them to the table and contributing to a more efficient and respectful use of technology in humanitarian settings.
What is the purpose of the DigitHarium?
The DigitHarium comes from two different streams of work. The first is related to our working series on data protection and humanitarian action. Since 2015, we have convened a large advisory group and a wider community of interest and practice: humanitarian organisations, National Societies, academia, civil society, and tech companies. The idea was to see how technology is changing the humanitarian landscape in terms of opportunities and challenges. If we look at what is happening around us, we know, going back to the Snowden revelations and the Cambridge Analytica scandal, how technology and data can be leveraged to control populations for government or corporate surveillance purposes.
We faced a dilemma. Some humanitarian organisations were simply jumping in and saying, “we’ll figure it out”, testing and failing without really thinking it through. The problem is that failing with the data of very vulnerable people actually puts them at risk. Other organisations were too scared to take on risks they did not fully understand. We decided to provide them with guidance, and to enable the humanitarian sector to have the level of awareness and knowledge required to navigate the complexity of technology, understand the risks, mitigate them, and be accountable for the ones that organisations decide to take.
We gathered quite a lot of interest across sectors and organisations around the globe, and we realised that there was value in convening, from this multi-stakeholder perspective, to make these questions accessible to a larger community of people. There was momentum around the launch of the second edition of the ICRC and Brussels Privacy Hub Handbook on Data Protection in Humanitarian Action, which coincided with the launch of the Humanitarian Data and Trust Initiative in collaboration with the Swiss Federal Department of Foreign Affairs and the OCHA Centre for Humanitarian Data. We needed to carry on this discussion.
The second component is the Humanitarium, which some people may already be familiar with. It is a forum, a space set up by the ICRC to convene multi-stakeholder discussions on topics of law, policy and ethics of humanitarian action. As Covid-19 forced us all to engage with each other digitally, we thought it was important to create that forum in a digital setting, to discuss digital issues.
What would you say are the main issues related to data protection in humanitarian settings, more specifically in humanitarian emergencies?
There are many challenges that derive from the simple fact that humanitarian emergencies tend to evolve and unfold in places where legal frameworks are not particularly solid, or where conflict poses a challenge to the rule of law. From the outset, then, the environment in which personal data protection rights can be respected lacks consistency. In conflict in particular, data is very important. There is, and has always been, a strong interest from all parties in acquiring as much information as possible about everybody: your enemy, whoever may be supporting your enemy, your own population, the enemy’s population. Information is extremely valuable.
There is an additional challenge, which is the vulnerability of populations. Often people don’t really have a choice, whether in dealing with their authorities, in their day-to-day lives, or with humanitarian operators. When a person really needs to make sure that there is enough food at home by the end of the day, and has to keep warm and healthy, notions of consent, information or agency over one’s data are often overridden by the necessity to take care of more basic needs. This means people are more vulnerable to their data being misused. The responsibility falls on humanitarian organisations, as they increasingly use powerful yet intrusive technologies such as biometrics.
Now, organisations that have been working in humanitarian settings for a long time are used to navigating those complexities. If we look at our policies, guidelines and operating procedures, particularly around protection activities, we notice that data protection principles and requirements are deeply embedded in all that we do. They have been an integral part of our protection approach. This is not surprising: protecting people’s data is protecting people, and that is exactly what protection activities are about.
But what is extraordinary when considering tech developments in recent years is technology’s capacity, in everyday life and in everybody’s pocket, to generate data at a rate that was unimaginable even just 10 years ago. And it isn’t just about generating data. Technology also brings the capacity to capture it, host it, and analyse it with artificial intelligence and machine learning. Today we can make sense of data in ways we would never have managed before, even with armies of people going through the information. This is where the importance of data protection comes from. It is a tool that enables us to carry on with our mission, but in a different space, in a different world.
Do you think all organisations are equal in that respect?
I think adopting good reflexes is potentially easier for organisations that have a protection mandate, because it is part of the nature of what they exist to do. The very objective of the organisation is to protect people. Because of that, there is a natural alignment with the importance of really understanding the possible consequences of what we do before we do it. This translates not only into that mindset, but also into internal political commitment from the administration to invest in this area, which, at a time of budget cuts and difficulties, is incredibly important. Most humanitarian organisations do recognise the importance of data protection. The question then is about investing in it, from a governance perspective, but also in research and development. We do want to engage with technology to continue serving affected populations to the best of our means and capacity, and we want to get it right.
What opportunities have you identified in that sense?
One example is the use of biometrics. Biometric data is becoming more and more prominent in humanitarian work, and there are clear reasons why. One reason is that conflicts are increasingly protracted: they last longer. The average duration of the ICRC’s 10 biggest operations is now over 40 years. This means that we are less and less involved in one-off emergency response, and more and more in repeat assistance. In conflict environments, we are being asked to do more with less, and donors hold us increasingly accountable for avoiding fraud, aid diversion, and duplication of aid, for example.
Biometric data is a useful tool to ensure all of that. Some organisations have biometric data sets covering about 16 million people, so the data really is being used at scale. Now, from our point of view, we haven’t actually used biometrics other than in very specific use cases where our mandate is to identify people, such as missing persons situations or the separation of family members. In the case of aid distributions, we have not been using them. There, we don’t have a mandate to identify people, but to provide aid to people, which is not necessarily the same thing. So identifying people through biometric data is more a matter of the organisation’s legitimate interest in ensuring effectiveness and efficiency.
The ICRC’s Biometrics Policy was adopted by the ICRC Assembly at the end of 2019. In the case of aid distribution, it provides for the possibility of leveraging biometric data ‘on token’. This means that the ICRC will not hold a copy of the individual data: the information is recorded only on a card or token, which the individuals carry with them. If, for example, the organisation is told by a party to an armed conflict or an actor involved in a situation of violence to hand over the data, or if it has to leave the country, there is nothing that can be handed over. We don’t have anything that can be lost or mishandled, and we don’t have a cyber honeypot attracting potential attacks.
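To make the ‘on token’ idea more concrete, here is a minimal, purely illustrative sketch in Python. Every name is hypothetical, the matching logic is a simplified stand-in, and nothing here reflects the ICRC’s actual implementation; real deployments use dedicated fuzzy biometric matchers and secure card hardware.

```python
# Illustrative sketch of 'on token' biometrics: the template lives only on
# the card the person carries, never in a central database.
import hashlib
from dataclasses import dataclass

@dataclass
class Token:
    """Card carried by the recipient; it holds the ONLY copy of the template."""
    template: bytes

def derive_template(scan: bytes) -> bytes:
    # Stand-in: a real system derives a fuzzy-matchable biometric template,
    # not a cryptographic hash, because two live scans are never bit-identical.
    return hashlib.sha256(scan).digest()

def enrol(scan: bytes) -> Token:
    # Write the template to the token and keep no central copy, so there is
    # nothing to hand over, nothing to lose, and no honeypot to attack.
    return Token(template=derive_template(scan))

def verify(token: Token, fresh_scan: bytes) -> bool:
    # At a repeat distribution, match a fresh scan against the token's template.
    return derive_template(fresh_scan) == token.template
```

The design choice mirrors what Marelli describes: because the organisation retains nothing after `enrol` returns, a demand to hand over data, or a forced departure from the country, leaves nothing behind.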
This helps us to achieve something really valuable: effectiveness and efficiency in repeat distributions. It also enables us to prevent aid diversion. We will know that aid has reached a physical person; we will not, however, have the technical certainty of deduplication. And I stress the ‘technical’ element, because we will always apply all the techniques we have been using over the years to minimise duplication, particularly by working with communities. This is why we launched a partnership with the Lausanne Polytechnic University (EPFL): to apply their most advanced computer science research to this problem and run a project that enables us to continue with the ‘on token’ approach, with all the benefits listed above, while also ensuring ‘technical’ deduplication.
Do you feel that there is a ‘before Covid’ and an ‘after Covid’?
I would not say that there is a before and an after in the sense that the landscape is now completely different, with different problems and different conclusions. But there seems to be a consensus, and I agree with it, that Covid contributed to a huge acceleration of these dynamics, of digitalisation and the use of digital tools. I think it also exponentially increased awareness of the importance of data protection.
We have seen what emerged in the computer science community after Snowden. Snowden gave birth to a new generation of engineers focusing their research on privacy-enhancing technologies and security, and that generation faced the challenge of Covid. It is not just technology in the service of privacy, finding solutions to the problems we have. It is also technology that responds to the major issue of trust, which is at the heart of the responsible and successful deployment of technology.
It is clear that, even though people understand the need to leverage data, they are uncomfortable with new technologies that somehow have the capacity to control parts of their lives. Data protection enables the use of technology and the leveraging of data in ways that do not unnecessarily and unduly intrude on the personal space.
What would be the next steps to actually rebuild or reinforce this notion of trust, and move forward?
The first one is privacy, data protection and security by design and by default. What we have seen with digital contact tracing is not just a data-protection-by-design protocol for a contact tracing application; it is the development and implementation of a philosophy around a tool that achieves an objective in a new way. We need to embed this approach in everything new that we do.
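As a concrete illustration of that philosophy, here is a heavily simplified sketch of the decentralised, data-minimising pattern popularised by contact tracing protocols such as DP-3T. Marelli does not name a specific protocol, and the key schedule and function names below are illustrative, not a faithful specification.

```python
# Simplified sketch of decentralised contact tracing: real protocols use
# HMAC-based key derivation and far more structure than shown here.
import hashlib
import os

def new_daily_key() -> bytes:
    # Generated on-device; it never leaves the phone unless the user tests
    # positive and consents to upload it.
    return os.urandom(32)

def ephemeral_ids(daily_key: bytes, slots: int = 96) -> list[bytes]:
    # Short-lived identifiers broadcast over Bluetooth; rotating them stops
    # third parties from linking broadcasts back to a single device.
    return [hashlib.sha256(daily_key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(slots)]

def exposure_check(heard_ids: set[bytes], uploaded_keys: list[bytes]) -> bool:
    # Matching happens locally on each phone: no central register of contacts
    # is ever built; only daily keys of consenting positive cases are shared.
    return any(eid in heard_ids
               for key in uploaded_keys
               for eid in ephemeral_ids(key))
```

The point of the pattern is that privacy is not bolted on afterwards: the data that would need protecting centrally is simply never collected.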
The easiest way to explain this new way of thinking is to transpose it to topics like digital identity and immunity passports. Technology is not just about digitising processes that we already have.
In the paper world, the analogue world, in order to demonstrate my identity I need to carry around a passport that contains all my information. Whatever I need to do, I have to show that paper and all the information that goes with it. The digital world gives us an opportunity not only to digitise that information and make it more effective and efficient; it also helps us rethink the way in which we assert and use our identity.
If I am dealing with an authority to request a certificate, or with a humanitarian organisation to get aid, I don’t necessarily need to show everything that is in my passport; I just have to assert the one particular attribute that is relevant for that interaction. Digital technologies enable us to improve on the status quo. It is a rethinking of the way in which we interact and use our identity. This brings out notions of minimisation, retention and privacy by design. This is where trust will be built in the future.
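A hedged sketch of how such selective disclosure can work: the issuer commits to each attribute separately with a salt, and the holder reveals only the one attribute, plus its salt, that an interaction requires. Real systems use verifiable credentials or zero-knowledge proofs; every name below is illustrative.

```python
# Illustrative selective disclosure via per-attribute salted commitments.
import hashlib
import os

def issue(attributes: dict[str, str]):
    # The issuer commits to each attribute independently. In a real scheme
    # the commitments would be signed; the salts stay with the holder.
    salts = {name: os.urandom(16) for name in attributes}
    commitments = {name: hashlib.sha256(salts[name] + value.encode()).digest()
                   for name, value in attributes.items()}
    return commitments, salts

def verify_one(commitments: dict[str, bytes],
               name: str, value: str, salt: bytes) -> bool:
    # The verifier learns this one attribute and nothing else in the 'passport'.
    return hashlib.sha256(salt + value.encode()).digest() == commitments[name]

# Usage: assert only eligibility for aid, not the whole identity document.
commitments, salts = issue({"name": "A. Person", "eligible_for_aid": "yes"})
assert verify_one(commitments, "eligible_for_aid", "yes",
                  salts["eligible_for_aid"])
```

The data-minimisation principle Marelli describes falls out directly: the verifier never sees the attributes that were not disclosed.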
Is trust also a matter of learning and education?
Yes, it is indeed one of the key pillars of the Humanitarian Data and Trust Initiative mentioned earlier, alongside the other pillars of policy and dialogue, and research and development. We need to make sure that people across organisations are comfortable discussing and engaging with the complexity of new technologies and data protection, and that can only happen if people are trained. This is why the ICRC Data Protection Office is developing a training and certification programme for Data Protection Officers in humanitarian action with the University of Maastricht European Centre on Privacy and Cybersecurity, as well as additional modules with EPFL and ETHZ.
I am convinced that we cannot apply the law to technology without really understanding the technology. We cannot take policy decisions on the deployment of certain technologies without understanding their nuances, and this is why these trainings are so important.