The humanitarian sector is on the cusp of a technological revolution, with new ICT solutions being developed and implemented in the everyday work of humanitarian organisations. The emerging technologies that will shape the next few years are complex, perhaps not in their use but certainly in their make-up and in the diversity of consequences they can bring about. Unsurprisingly, many of the new solutions will require end users to give up personal information related to their identity, location and habits. The “reward” for all of this will be their safety and security on a mission. However, one must remember that technology per se is amoral and apolitical: it is neither good nor bad. Only the human use of technology can be good or bad.

Against this backdrop it is important to understand the socio-cultural considerations and impacts of these solutions. This is a step that goes beyond a traditional privacy and data protection impact assessment of technology tools. This report, therefore, undertakes an analysis of the social and cultural implications of the use of the iTRACK tool and of each of its components, namely StaffSense, the DSE and QR components, SocialSense, on-board sensing and threat detection, and the use of multiple components in combination. It comes to the following conclusions:

• StaffSense: The main social and cultural implications of the StaffSense component have been identified as issues revolving around safety and privacy, trust, and misconceptions of threats and errors. These have potential impacts on the principles of safety and security, dignity, trust and transparency, and Do-No-Harm. However, the report outlines that many of the issues discussed already exist in humanitarian settings due to the presence of other tools and processes. Further, the report notes that the information collected by StaffSense will be password protected and encrypted, like the other components in iTRACK, rendering it extremely difficult to hack into and minimising many of the risks highlighted. The report also analyses the various steps the Consortium has taken to minimise these risks, as well as recommendations for implementation that will significantly help to reduce these implications. It concludes, therefore, that the introduction of StaffSense will not have any severe new social and cultural consequences, at least no more than already exist.
• DSE and QR components: The report argues that, with regard to the DSE, the tracking of employees may have a significant impact on the privacy of individuals and on data protection, possibly encroaching on the principles of autonomy, dignity, privacy, and trust and transparency. The report highlights a need to convey to humanitarian workers that the added security benefits of the system outweigh the negatives of being tracked, particularly as the system will avoid data leakages. With regard to the QR component, the report indicates that by tracking assets, humanitarian workers may feel scrutinised and blamed for losing assets, which may further affect their ways of working.
• SocialSense: This section identifies the various socio-cultural considerations related to SocialSense, mostly concerning issues of safety, security and misinformation, and the principles of non-discrimination and justice. In particular, SocialSense could put humanitarian workers in danger by engaging with fake or misleading content. It may also allow some groups to be prioritised while others are excluded, or lead to the identification of certain groups and individuals, putting them at risk of being targeted. The analysis notes, however, that SocialSense is just one of the components within iTRACK that seeks to add to on-the-ground intelligence; other components may gather additional information in order to create a broader picture.
• On-Board Sensing and Threat Detection: This section analyses the various socio-cultural issues related to the implementation of on-board sensing and threat detection in iTRACK. The issues discussed include misinterpretations and blind spots, increased risks, changing relationships, and dependence on technology, with reference to the principles of non-maleficence, non-discrimination, justice, transparency, trust and collaboration, and security and safety. These concerns apply particularly to iTRACK’s most visible aspect, its camera, which could cause uneasiness, and to its less visible aspect, its algorithm, which could have discriminatory tendencies, opposing the principle of equality and non-discrimination. However, as the report discusses, iTRACK will only categorise objects and not people, meaning that its algorithm will not oppose the principle of non-discrimination by creating bias or exhibiting discriminatory tendencies. Nonetheless, the report insists that the system must be deployed with caution, particularly as determining the potential and actual ethical impact of an algorithm remains difficult.
• The implications of multiple components: This section analyses the implications of iTRACK’s multiple components operating together and states that iTRACK has several social and cultural implications, such as the potential to be used as a weapon of war, a risk that directly opposes the principles of safety and security and dignity. Changing relationships and trust between host governments and beneficiaries are also considered key social and cultural implications of the iTRACK system, as is the lack of local voices in its development. The report argues that these issues have the potential to create additional risks and a disconnect between aid workers, beneficiaries and host governments, directly going against the principles of trust, transparency and collaboration, dignity, and non-discrimination. The report stresses that every organisation can choose between the different modules iTRACK has to offer, and thereby between the risks that come with them.

Overall, when introducing new technologies, it is important to consider whether the benefits clearly outweigh the risks. The report considers that in iTRACK’s case they do and that, with appropriate training, education and roll-out, many of the issues discussed in this report could be avoided or minimised.

Moving forward in this field, however, various recommendations should be considered, including the following:

• Significant training must be undertaken to help humanitarian workers understand some of the potential risks associated with the system and how to minimise them.
• The system must be introduced in a transparent and open way that demonstrates its positive implications, so that beneficiaries understand what the system seeks to achieve and humanitarian workers understand that, although they may be tracked by their organisations, this tracking brings additional safety and security.
• Additional research must be undertaken in the field of humanitarian technologies and their impact on host countries and beneficiaries.
• Research should be conducted to analyse how issues related to gender and equality may be affected by the introduction of new technologies in the field.
• Any technology should be introduced in such a way as to avoid inequality. Additionally, the technology should be open to everyone, not just to certain genders or ages, or to the wealthy, powerful or technologically advanced.
• The consortium acknowledges that additional research is needed to understand the prevalence of the effects of algorithm-driven decision-making systems such as iTRACK, and to discern how to minimise the inadvertent justification of harmful acts, which, in iTRACK’s case, may become more obvious once the system has been deployed.