By Rolf Bakken
I have a confession to make… I will never be a techie. I will never be fascinated by how an appliance works, never be thrilled by opening something up to see how it is wired, and never be carried away by software code or new, integrated solutions.
I am fascinated by the art of information management, though: how data is collected, collated, analysed and processed into information. When it comes to emergency response, we always endeavour to improve the quality of information so that better-informed decisions can be made in disasters. In a rapidly changing emergency environment, information is rendered useless for decision-making if it is not disseminated at the right time, to the right recipient, in the right format.
In my view, the best app ever invented for emergency management is, without a doubt, pencil and paper; it never runs out of battery, never stops working when wet, and is never incompatible with local systems. Innovative technological advances made over recent decades have opened up possibilities we deemed science fiction only a short while ago, but the question now is how best to use the technological innovations surrounding us.
In January, I participated in a workshop arranged by the Assessment Capacities Project (ACAPS; www.acaps.org) that gathered more than 50 of ACAPS' roster members, partners and staff with decades of experience in assessments and information management processes in humanitarian response operations. They had diverse backgrounds spanning UN agencies, NGOs and academic institutions, all dedicated to better-coordinated responses and to strengthening information management and decision-making processes. The aim of the workshop was to share new thinking, methodologies and tools for assessments, and also to share experience and good practice from deployments.
Recent technological developments have given rise to new ways of gathering and sharing data in humanitarian crises. This is great, but new developments sometimes seem to appear and evolve faster than practitioners' ability to absorb and operationalize them. During the workshop we were asked: “How do we decide which technology to use in humanitarian response operations?” The aim was to generate a set of criteria by which to judge new and existing technologies. Here's what we came up with:
Ease of use – Technological solutions should be easy to learn and easy to use. A cost-benefit approach should be taken, where the usefulness of the solution should be compared to its ease of use.
Resilience – Any technological solution should be able to withstand the austere working environments often found in humanitarian crises. It also needs to withstand the often less-than-gentle treatment it may receive from technically inexperienced staff. One participant described the “Fisher-Price test,” and anyone who has seen a two-year-old play with/attack their favorite Fisher-Price toy will know what this means!
Tried and Tested – It was a firm belief amongst the participants that new solutions should be thoroughly tested and evaluated, in the context of emergency use, before being rolled out. The height of an emergency might not be the right moment to introduce new solutions; if it is done at all, it should be done as a real-time field test, supplementing other solutions.
Correlation – We should be careful not to let the tools control our activities but, rather, use the tools to enhance the efficiency of those activities. We should avoid trying to force a square peg into a round hole. All too often we are so fascinated by new and innovative tools that we try to manipulate the reality (i.e. the data) to make it fit a particular piece of software or other application.
Relevance – A well-functioning information management process is not an end in itself, but an important stepping stone in a well-functioning decision-making process. Technological solutions should be introduced and applied with this in mind. They are only the grease that oils the machinery.
Representative – Do the solutions/resources/tools represent the wishes and preferences of the affected population and local responders? Are we introducing and applying something that is of interest only to a few from the international community, and not those directly affected by the crisis or those who will be working on the response for the longer term?
Consequences – Does the solution have unintended consequences? Could it become an obstacle and increase an already packed workload rather than supporting more efficient work processes?
Halo Effect – We should be aware of our human tendency to gravitate towards new, fancy solutions. It is easy to become blinded by the brightness of their shiny halos and potentially overlook their actual usefulness. New tools can give data, and the information products derived from it, a false sense of quality and accuracy. Bear in mind that the quality of the information rests on the quality of the data, not on the way it is processed or presented. Rubbish in, rubbish out.
Dependency – New solutions should be able to function on their own and should not increase our dependency on further technology to work properly.
Added Value – Technological solutions should add value to the response and optimise existing solutions. They should be a “need to have” and not just something that is “nice to have” – especially not if there is a simpler solution already available.
Some of these points may seem somewhat conservative and cautious in how they approach new technological advances, but by focusing on the populations and societies affected by disasters and other crises, they take a bottom-up perspective. If I were to sum up the discussion in one conclusive statement, it would be: “Technology matters and can bring great benefits to the ways we work. However, solutions should be adopted because they meet an actual need, rather than just because they are there.”