
Empowering Humanitarian Response Through Crisis Informatics

Leveraging Data for Effective Decision-Making in Emergencies

February 22, 2024

In the age of polycrisis, humanitarian organizations responding to emergencies must do so as quickly and efficiently as possible. Having the right information to target actions correctly and to make good decisions is essential, both for these organizations and for those they serve.

And there is a wealth of data available - from international institutions such as the UN’s Office for the Coordination of Humanitarian Affairs (OCHA), which shares data through ReliefWeb and the Humanitarian Data Exchange (HDX); from governments; from global and local technology and telecom partners; and of course from the international NGOs and their local implementation partners who are collecting data on the ground.

However, organizations struggle to acquire what they need in any given context. In emergencies especially, data must be timely, accurate and trusted, because there is no time for verification - only the urgent need to process the data and use the insights efficiently.

Here is where the discipline of Crisis Informatics comes in. 

There is no commonly agreed upon definition of Crisis Informatics. For the purpose of our research, we define it as: the technology to collect, process and distribute information required to make decisions and take actions, and the interconnectedness of the different actors in a crisis situation, during all phases of a humanitarian or natural disaster (preparation, mitigation, response and recovery). 

These are the findings from an initial survey of a small number of NetHope Members and partners (listed at the end).

A complex ecosystem

We have found that the layout of this ecosystem varies between organizations, as illustrated by these example ecosystem maps.

Conversations with some of our NetHope Member organizations (all of them global nonprofits) show one of the key challenges very clearly: the complexity of the ecosystem. By this they mean the many different parties who may be owner, contributor, user, or intermediary for data (sometimes taking more than one of these roles) in a crisis situation, and the many different types of information that are exchanged in many different ways. It takes time and resources to make sense of all this - from establishing which datasets are most relevant for a specific context, to validating quality and accuracy, to standardizing and formatting different datasets so they can be consumed easily, to making sure any relevant data privacy regulations are followed. These are time and resources that nonprofits do not tend to have, meaning they are not in a position to make the best use of the data or, worse, make suboptimal decisions based on incorrect or incomplete data. Navigating this complexity requires staff time and expert ecosystem navigators to access accurate data and contribute effectively. The presence of multiple data streams also creates uncertainty about which source should be trusted, so resources must be allocated to determining the most relevant source based on factors such as recency, provenance, and curation.
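To make the "standardizing and formatting" step concrete, the sketch below merges two hypothetical partner extracts that describe the same facts under different column names into one minimal shared schema. All file names, column names and the target schema are invented for illustration; real sources and schemas vary by crisis and by partner.

```python
# Illustrative sketch only: standardizing two hypothetical partner extracts that
# describe the same facts under different column names into one shared schema.
# File names, column names and the schema are invented for illustration.
import csv

FIELD_MAPS = {
    "partner_a.csv": {"district": "admin1", "hh_reached": "households_assisted", "report_date": "date"},
    "partner_b.csv": {"Admin 1": "admin1", "HH assisted": "households_assisted", "Date": "date"},
}

def harmonize(path, field_map):
    """Rename source-specific columns to the shared schema and drop everything else."""
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            yield {target: row[source] for source, target in field_map.items() if source in row}

combined = [record for path, fmap in FIELD_MAPS.items() for record in harmonize(path, fmap)]
print(f"{len(combined)} harmonized records ready for analysis")
```

Even in this toy form, the mapping table is effectively a small standard that must be agreed, documented and maintained - which is exactly the effort respondents say they cannot resource for every crisis.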

Challenges of data collection

The challenges organizations face when acquiring relevant data are many. The top-cited barriers to crisis informatics data collection were the lack of common and/or minimum standards, organizations not having a systematic information management (IM) process, questionable data quality, and the lack of funding and other resources to overcome these barriers, let alone to curate and derive insights from the acquired data. These findings are remarkably similar to the concerns raised in earlier research conducted by NetHope. Agreeing on common standards, making them interoperable, and getting them applied by the actors in this humanitarian data ecosystem clearly remains a challenge, aggravated by the lack of resources dedicated to standardizing data structures and representations. (See NetHope’s related work in Frontline Humanitarian Logistics, Data for Disaster Preparedness, and Benchmarking NetHope Members to CIS controls for cybersecurity.)
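One lightweight example of such a shared standard is HXL (the Humanitarian Exchange Language) promoted through the HDX ecosystem: a row of hashtags beneath the normal column headers lets tools align columns regardless of how each organization labels them. The sketch below, with an invented file name and a minimal set of tags, shows how little machinery is needed once such a convention is agreed.

```python
# A minimal sketch of what a lightweight shared standard buys: with HXL
# (Humanitarian Exchange Language), a row of hashtags beneath the human-readable
# headers lets tools align columns regardless of how each organization labels them.
# The file name and the specific tags used here are illustrative assumptions.
import csv

def read_hxl(path):
    """Yield rows keyed by HXL hashtag rather than by source-specific header."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        next(reader)         # row 1: organization-specific column labels
        tags = next(reader)  # row 2: HXL hashtags, e.g. #adm1+name, #affected
        for row in reader:
            yield {tag: value for tag, value in zip(tags, row) if tag.startswith("#")}

for record in read_hxl("3w_activities.csv"):
    print(record.get("#adm1+name"), record.get("#affected"))
```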

The challenge is multilayered. The lack of predictable and repeatable standards between agencies (and their associated ways of working) encourages a single agency to try to derive its own organization-wide standard, for lack of anything else to work with. In the larger INGOs, however, this effort is often doomed to fail, stall, or remain incomplete, because these agencies are highly interconnected with that same ecosystem: to participate in consortia or work with multilateral donors, the agency is forced to adopt various ad hoc IM approaches depending on the other actors it is working with in each situation. For the single agency this results in multiple internal projects, each with its own data architecture and processes, strongly incentivized to stay tied to an “outside” requirement and therefore seemingly unchangeable. Here we also note the diverse data structures encouraged by donors and funders, the myriad of different data architectures imposed by ‘out of the box’ technology and digital tools, and even differences between governments and legislation in how data must be structured and treated - all of which act in concert to discourage harmonization and standards.

Half of the survey respondents mentioned the need to establish a new IM process or method in every crisis, which reflects this challenge of replicating an IM process across crisis contexts - and may also point to a lack of standard organizational knowledge management practices, leading to a tendency to “reinvent the wheel” in each new emergency. A lack of systematic IM processes can be an indicator of low digital skills capability in an organization, specifically data literacy, which was also a concern in the earlier research. The loss of historical data, and important data not being collected at all, may likewise point to skills challenges and gaps in data governance.

A number of responses also specifically mention technology as an additional barrier - low internet penetration, lack of common technology, and low information technology usage. In today’s digital world, restoring or setting up connectivity in areas impacted by disaster, and having easy-to-use tools to collect data, should be a given, but this is not yet the case.

Several respondents also stated in the open-ended responses that their biggest need is understanding what information the people on the ground - those making quick trade-offs in pursuit of speed of response, be they responders or the affected communities themselves - actually need. There can be competing priorities from different organizational mandates and donor requirements, so quickly assessing this complex situation and establishing the data landscape is often a major first hurdle in crisis data collection. Another issue that came up twice in the open-ended responses is the difficulty of accessing government data or of reaching certain affected population segments to collect data about them.

Based on your knowledge and experiences, what do you think are the barriers to crisis information management for most humanitarian actors in terms of data collection? (n = 19)

Barriers to communicating data

Many different international and local humanitarian actors get involved in any given emergency, so in an ideal world they would be sharing and exchanging information that is relevant to all of them. The reality today, however, is that there are many roadblocks to making that happen. The organizations we spoke to mentioned several reasons for this.

Based on your knowledge and experiences, what do you think are the barriers to crisis information management for most humanitarian actors in terms of communicating data with other humanitarian actors? (n = 19)

Most survey respondents stated they need a systematic process or platform for sharing data - ideally across multiple projects, locations and time frames. There is a general sense that a centralized information system for crisis data sharing is absent. Every time a crisis occurs, responding organizations form a temporary coalition, and during the response one or more of them take the lead in coordinating around data. This ad hoc coordination is not effective: it is an inefficient use of resources, hinders the building of lasting expertise, and (once again) leaves individual organizations to solve their own data needs.

Some actors plainly do not want to share data. This could be, quite legitimately, because they have concerns about the legal and ethical implications of data privacy and may not have the necessary processes and tools in place to manage them; or there may be conflicts of interest or security considerations. In some cases, they simply cannot justify the return on investment (ROI) of actively contributing data to a more central pool. As with more traditional knowledge management, incentivizing sharing with a collective is notoriously hard: individuals and organizations are encouraged to hoard data, make themselves indispensable, and focus on immediate return on effort rather than long-term community-wide gains. In situations where resources are already stretched, such as a humanitarian crisis or disaster response, this barrier is high enough to discourage sharing altogether.

Several of the barriers mentioned, and the sometimes missing immediate ROI, have to do with the data itself - it is inconsistent, it comes from multiple sources that vary in quality, there is no common terminology, and there is too much data to process that is not easily interoperable. At this point in time, the largest issue seems to be not the absence of data for the sector, but the difficulty of assessing the quality of the data that is available and then creating actionable insights from the multiple data streams. This again may point to a skills gap in data literacy, and it raises the question of to what extent technology could support the analysis of large amounts of data that do not follow a standard structure.
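As one small illustration of how tooling might help with the lack of common terminology, the sketch below normalizes free-text labels from different sources to a small canonical vocabulary and flags anything unmapped for human review. The synonym table and labels are invented for illustration; in practice such a vocabulary would be built and maintained with domain experts.

```python
# Illustrative sketch: reconciling inconsistent terminology by normalizing
# source labels to a small canonical vocabulary and flagging anything unmapped
# for human review. The synonym table is invented; in practice it would be
# built and maintained with domain experts.
CANONICAL = {
    "idp": "internally displaced person",
    "internally displaced": "internally displaced person",
    "displaced persons": "internally displaced person",
    "refugee": "refugee",
}

def normalize(label: str) -> str:
    """Map a source-specific label to the shared term, or flag it for review."""
    return CANONICAL.get(label.strip().lower(), f"UNMAPPED: {label}")

print([normalize(term) for term in ["IDP", "Displaced persons", "Returnee"]])
```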

It should also be noted that many organizations are not connected to the established humanitarian networks, and that information and data are most often not available in the languages spoken closest to the disaster or crisis. From this we can also infer a gap in the involvement of local actors and implementation partners in the overall “data ecosystem” in a crisis - whether as providers of data and information, or as consumers of it to inform their work with affected communities. This lack of involvement in turn contributes to uncertainty about the validity, timeliness and quality of local insights and data - which once again discourages participation in the data ecosystem, even by the larger multinational agencies.

Where does this leave us?

Building on the findings from this initial survey, we will explore some of these areas of concern in more depth. A strong focus needs to be on questions such as:

  • What are the actual needs of actors on the ground, and how can we ensure they get this data and have the skills to derive relevant insights from it? 
  • How can we better integrate local actors, including impacted populations, in the collection and use of data in a crisis situation? 
  • What are the skills gaps that need to be addressed for humanitarian organizations to have the right processes and tools in place that provide repeatable information management? 
  • How might technology help - from the very basic needs of connectivity to collect and access data, to much more sophisticated tools such as the use of AI to generate insights from data? 

Acknowledgements

We thank the following NetHope Members and Partners for the valuable input to this research: Cisco, Concern Worldwide, Danish Refugee Council, Emergency Telecommunications Cluster, Humanitarian OpenStreetMap Team, Microsoft Disaster Response, Norwegian Refugee Council, Trek Medics, and World Vision International.
