This blog post is written by Johan Eldebo and Sarah Pickwick, who both work for World Vision. Johan is World Vision International’s Regional Security Director for Southern Africa and Sarah is World Vision UK’s Senior Conflict Adviser. Through this post the authors aim to share some of their experiences and to encourage discussion on the subject.
Note: this article represents the views of the authors and not necessarily the views of any organisation, including World Vision.
Providing aid in dangerous and difficult-to-reach areas is essential, particularly given that the majority of people in need of aid live in areas where authorities are unable or unwilling to provide basic services. These areas are also commonly plagued by violence, meaning that delivering aid to them is often an insecure and fragile operation.
For such operations to succeed, it is essential that organisations understand the area’s context in terms of local history, economy, politics, and existing power dynamics as well as how it relates to the national and international context. This blog suggests a few criteria that organisations can follow in order to improve the utility of context analysis in fragile contexts for strategic and tactical decision-making, with a focus on rapid, participatory and flexible analysis.
Understanding context requires looking at the past, the present, and, as much as possible, looking to the future. Context analysis is a highly complex task at the best of times, and contexts change frequently; analysis that was accurate in the morning can be invalidated by the afternoon. Being ‘right’ about context is therefore extremely difficult, but more than this, being right is not enough. In its endeavour to become more accurate, analysis in humanitarian contexts (paradoxically) risks becoming ineffective.
‘Analysis’ refers to a product designed for the purpose of influencing a particular set of people on a particular subject at a particular time. The humanitarian sector struggles with unused analysis. Sometimes this is because decision-makers want the information simply to validate existing plans, and therefore ignore ‘inconvenient’ information that may inhibit them, and other times it is that the analysis produced is overly esoteric, impractical, or has been made available too late to inform either strategic or operational plans.
As such, the sector seems to agree that we all need better analysis. However, in a bid to maximise accuracy, ‘better’ often seems to mean more complex, more time-consuming and, consequently, less accessible to the would-be users. This blog post argues that the sector should instead strive for ‘good enough’ solutions that are light-touch, locally-grounded, user-friendly and timely. To get closer to a point where analysis reliably informs strategy and operations, this blog offers three broad principles to guide effective, ‘good enough’ operational and security-based context analysis, intended to complement, rather than replace, more rigorous processes.
Guiding principles for context analysis
1. Analysis with ownership leads to action
It is not uncommon for the entire process of analysis to be completed separately from those who would be using the information. Whilst unfortunate, this is all too often necessary because operational staff struggle to find time to complete a more rigorous analysis process. At the same time, the knowledge of these operational staff is critical to inform good analysis, and their buy-in is crucial for any action to result from it. If those for whom analysis is intended have a sense of ownership over it, they are far more likely to take accountability for acting on it. Therefore, the purpose, timing, methodology and format of the analysis should first be established and agreed upon with those who will ultimately use it.
As well as operational staff, buy-in at the senior level is generally required for validating and signing off content, and approving the timing and intended audience. However, senior leaders’ involvement will be dependent upon their role in the validation and action-planning elements that shape the recommendations, their understanding of the process, and their ability to contribute to the content.
2. Analysis should be timed with strategic and tactical decision-making
Ideally, all decisions should be reached following thorough examination of complete data. In reality, much analysis that strives to achieve this becomes so time-consuming that it is not ready to be used at all by the time decisions must be made. ‘Good enough’ analysis is therefore often the best available solution at the time of decision-making.
Whilst comprehensive, proactive analysis at all levels is indeed crucial, particularly for strategic decision-making, there is a clear need for more short-term, reactive, and sometimes localised analysis that can anticipate and respond to sudden changes in the context. In many cases, the availability of good analysis is only one factor in decision-making, which can also be impacted by expediency and internal and external pressures. Since many of these decisions will therefore be reached whether informed analysis is provided or not, it is essential that interventions are not context-blind.
This requires analysis to take place when organisations move into a new area, when organisations are revising their strategy, or when there is a sudden or anticipated change in context. Crucially, it should also translate into continual monitoring of the context, rather than act as a one-off exercise. Analysis that is intended for decision-making must, therefore, consider the political context in which it will land, and be presented accordingly. Analysis presented to decision makers at the right time, in the right way, and with an awareness of the other factors that may be at play in decision-making, can and should help them to make a more informed decision.
3. Analysis should be practical and pragmatic, both in its process and final outputs
Sometimes analysis processes are over-complicated, and sometimes the analysis produced is unsuitably esoteric and/or impractical. The time constraints of the intended audience, as well as the level of detail that is useful for them, should be considered when completing analyses. For analysts themselves the same considerations must be made when designing the assessment methodology.
These considerations are particularly important if the intended audience is local field-based staff and the authors of the analysis are Western headquarters staff, as is often the case. The latter’s limited access to local information can lead to less practical analysis that does not meet the needs of operational teams. Similarly, recommendations made in context analyses should recognise the internal and external political and practical realities that decision makers will be managing.
A balance must be struck where the analysis is ‘good enough’ in its responsiveness and agility, whilst providing enough detail that offices and senior leaders will feel confident using it to inform their decisions. A practical and pragmatic approach will ultimately help to ensure the swift translation of analysis into action. This can help overcome the inaction that results when decisions are postponed until all the information is perfect; since ‘perfect’ information rarely arrives, and decisions cannot wait for it, such thinking creates real operational and security risks.
In summary, the collision between ‘correct’ analysis and ‘effective’ analysis is perhaps most acutely felt at the intersection of information, timing and politics. When those for whom analysis is intended haven’t played any part in its development, it is provided too late to inform decision-making, or the process or product is so over-complicated that it becomes impractical, being right is not good enough. In order to be useful and effective, analysis should (1) be somewhat influenced by those using it, (2) be made available in time to inform strategic and tactical decision-making, and (3) provide practical advice written with the audience in mind.