The International Humanitarian Studies Association: World Conference on Humanitarian Studies
By Katrina Petersen (Trilateral Research, UK)
Dates and location of the conference
27-29 August 2018
The Hague, The Netherlands
https://conference.ihsa.info
The main topics of the conference
The conference theme was (Re-)Shaping Boundaries in Crisis and Crisis Response. The conference presentations ranged across the intentions and practices of government control in humanitarian response; international law and the humanitarian principles; how people organize themselves in the face of humanitarian crisis; what it means to foster a local response; how to be upward and downward accountable in practice; and the rapid transformation of technology in humanitarian action. One of the primary foci of the talks was the many open questions regarding the importance of humanitarian principles and the effectiveness of humanitarian response, particularly in light of the recent calls at the World Humanitarian Summit for localisation and greater ties to development activities. Another was how to be locally accountable in a globally funded and globally driven world. A third was challenging, and providing critical and practical tools for rethinking, the role of technology, and more specifically ICT, in humanitarian response. A fourth was determining lessons learned in humanitarian history and practice, and how to encourage learning, not just training. There was also a set of panels on gender, but they conflicted with others I attended.
Approximate number of attendees to the conference:
Around 300.
What type of paper we gave, an overview of what it concerned, and feedback received and from what type of stakeholders
The paper was a 15-minute panel presentation, on a panel entitled “Ethics and Technology in Humanitarian Settings”. The paper was entitled “Ethics and Technology in Humanitarian Settings: an iTRACK case study”. The paper drew on the socio-cultural impact assessment deliverable for iTRACK to examine how the socio-cultural intersects with the ethical in humanitarian technology design and use. It used three case studies to discuss two main questions that arose from the related iTRACK deliverable:
- What cultural considerations need to be taken into account to really understand how privacy is a public good?
- How do cultural considerations change what is ethically acceptable or desirable?
The three case studies included:
- The tracking feature and the challenges in ensuring both privacy AND security, and in considering where privacy equals insecurity; with a short discussion of how tracking could offer insights into local cultural practices, and of how even this invasion of privacy could offer greater security to those being served (e.g. by creating accountability for workers and mitigating abuse).
- The underlying data sorting/analytics and the potential for culture creep that makes ensuring ethical use a challenge, highlighting the importance of transparency that is multi-culturally legible without forcing an erasure of the necessary differences that make for local understandings. This case also leads to questions about how to ‘test’ the analytics when there is no “right” cultural answer in the abstract, or even from one local implementation to the next (unlike facial recognition, where you either do or don’t get a face, even if there is bias in how that happens).
- The panic button, how interpretations of panic vary, and how this implementation of iTRACK makes it possible to shift the responsibility for threat identification (and security) from the individual to somebody else, or from an organisation to an individual; all of which blurs the question of blame.
The remainder of the paper focused on some questions anyone trying to understand the ethical implications of technology use in humanitarian contexts needs to consider:
- How to define responsibility?
- How to define justice and fair treatment?
- What is privacy, as a right, intended to offer?
- What elements of a situation need to be considered when determining if acts are ethical?
- What is a user required to know about a system?
Follow-on questions/interactions came from:
- A worker at a French SME that aims to improve participatory design, who highlighted the value of the analysis and inquired further about methods to help their work, given that this kind of analysis, done right and in context, is slow, yet the design world asks for fast results.
- A practitioner from a humanitarian group in Syria that focuses on environmental work, who followed up about the relativity of ethics (to which I promptly said that this can be a dangerous way of thinking, because it is too easy to then say “that’s your ethics, not mine, so I don’t have to deal with it”, as well as a form of othering).
- A humanitarian scholar from the US who asked about othering culture (to which I replied that culture here is complex and situated, in that the designers of the project do not represent the local cultural contexts in which the tools will be used, so acknowledging unknown or not-well-understood differences is important in order to mitigate culture creep).
- A socio-technical scholar from the US who wanted to follow up on how to use these questions to ensure that the technology and its designers are accountable, not just the users or the funders.
- Dutch and Irish humanitarian workers who pulled me aside afterward to ask more about how tracking could offer cultural insights. I told them about studies I saw a few years back, in which researchers were able to gain insights into how diverse people live in, work in, engage with, and value a city by observing differences in movements and timings across the city, offering better planning insights, etc. Both really liked that potential.