The Innovation Index Experiment

UNHCR’s Innovation Service has been undertaking an experiment. We created an Innovation Index: our first attempt at measuring how we’re doing when it comes to innovation in our field operations. We never released the product. This piece tells our Index story, looking at the whys, the hows, and, most importantly, the lessons we learned from the experience.

The Product: What is it?

The Index is a product that aims to provide a framework for analysing diverse approaches to innovation. It applies this framework to nine pre-defined UNHCR country operations to determine how they approach innovation across four key pillars: Community Engagement, Data Driven Implementation, Innovation Capacity and Space, and Access to Services. Each ‘pillar’ comprises a number of indicators made up of publicly and internally accessible data. Operations are then given a score based on this data and ranked. The product includes anecdotes of specific activities and initiatives within operations and outlines some case studies that demonstrate good practice.
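
To make this structure concrete, here is a minimal sketch of how such a pillar-and-indicator scoring scheme could work. The four pillar names come from the Index itself; the indicator names, the 0 to 1 normalisation, the equal weighting, and the sample data are all illustrative assumptions, not the Index’s actual methodology.

```python
# Hypothetical sketch of the Index's pillar/indicator structure.
# Pillar names come from the Index; all indicators, values, and the
# equal-weight scoring below are illustrative assumptions.

PILLARS = [
    "Community Engagement",
    "Data Driven Implementation",
    "Innovation Capacity and Space",
    "Access to Services",
]

def pillar_score(indicators: dict[str, float]) -> float:
    """Average the normalised (0-1) indicator values within one pillar."""
    return sum(indicators.values()) / len(indicators)

def operation_score(pillars: dict[str, dict[str, float]]) -> float:
    """Equal-weight mean of the four pillar scores for one operation."""
    return sum(pillar_score(pillars[p]) for p in PILLARS) / len(PILLARS)

# Illustrative, anonymised data for two of the nine operations.
operations = {
    "Operation A": {
        "Community Engagement": {"feedback_channels": 0.8, "cwc_quality": 0.6},
        "Data Driven Implementation": {"data_availability": 0.4},
        "Innovation Capacity and Space": {"innovation_fellows": 0.7},
        "Access to Services": {"service_coverage": 0.5},
    },
    "Operation B": {
        "Community Engagement": {"feedback_channels": 0.5, "cwc_quality": 0.9},
        "Data Driven Implementation": {"data_availability": 0.6},
        "Innovation Capacity and Space": {"innovation_fellows": 0.3},
        "Access to Services": {"service_coverage": 0.8},
    },
}

# Rank operations by overall score, highest first.
for name in sorted(operations, key=lambda n: operation_score(operations[n]), reverse=True):
    print(f"{name}: {operation_score(operations[name]):.2f}")
```

Even a toy version like this surfaces the design choices discussed below: how to weight pillars, and how to place very different kinds of indicators onto one scale.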

The Reason: Why did we do this?

The Index was the product of an innovation process in and of itself, and it started somewhat by accident. With the topic of innovation featuring prominently in the High Commissioner’s Strategic Directions, the Innovation Service understood that a new degree of rigour was required with respect to how UNHCR is collectively reforming its programming and incorporating innovative approaches. We needed to better understand how the Innovation Service could add value to the support we provide to country operations.

Like many other units in UNHCR, the Innovation Service has to prioritise its activities and interventions. How could we do this? We realised that, with the information we had on hand, it was hard to determine where opportunities lay and how we might be able to provide support, without a clear understanding of how innovation is being approached in different field operations.

While we didn’t know precisely what might come out of this process, we realised when we started that creating an Index was something we needed to embark on. The Index acted as a container for some of the more significant question marks we had as a team. By bringing them together, we felt we could tackle them collectively and link them to UNHCR’s strategic vision of innovation, and to how it contributes to making a positive impact on the lives of refugees. We’d seen other indices, such as the Big Mac Index or the Corruption Perceptions Index, and saw how powerful they could be at telling stories. By doing this ourselves, maybe we would be able to learn more about ourselves as a Service.

Ultimately, the Index also sought to improve how we hold ourselves to account. UNHCR has made commitments to innovate more, and we were resolute that we should be able to measure how we are, and are not, achieving this.

The Journey: How did it happen, and how did it evolve?

The accident of the Index started in the context of the Innovation Service looking to support a field operation. As part of this, we wanted to delve into UNHCR’s programming data in that operation to find out whether there was anything that might be construed as innovative, but there were limits to what we could infer from the data available. We then expanded the exercise to see whether we could discern anything from other operations’ data in this regard; it was complicated. There was considerable variation from country to country, which also made it hard for us to compare; we needed comparison to help us prioritise where action was needed and where opportunity lay.

This first assessment led us to realise that we needed to define better variables, or ‘indicators’, for innovation, and to recognise the limitations of our existing programmatic data in measuring them. The idea of developing an ‘index’ thus evolved, requiring the relative assessment of innovation across a number of operations. The approach was to start small and to expand the exercise if successful.

Internally, the Innovation Service had some debate about the indicators and content of the Innovation Index. Every member of the team was enthusiastic about its potential, and everybody had their own idea of what the Index should be doing. To be completely candid, it was challenging to build something that resembled consensus. One team member stated: “Each member of the team needs to be able to stand behind each of the indicators and defend them wholeheartedly.”

The team made its best effort to make that the case. The indicators included quantifiable ‘objective’ variables, such as the number of Innovation Fellows within an operation, as well as more ‘qualitative’ ones, such as the quality of UNHCR’s coordination of Communicating with Communities initiatives. Once these were determined, an exercise was undertaken to populate the variables with data from a range of sources. During this phase, two issues became evident: data wasn’t available for many of the variables, and the Innovation Service didn’t have consensus on what was being measured. The lack of consensus was particularly acute with regard to output-focused variables.
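
As a sketch of what populating such mixed variables involves, the snippet below shows one way counts and qualitative ratings could be normalised onto a common scale while tolerating missing data. The function names, scales, and example values are hypothetical; they are not taken from the Index.

```python
# Hypothetical normalisation of mixed indicator types onto a common
# 0-1 scale. All names, scales, and values are illustrative, not UNHCR's.

def normalise_count(value: int | None, max_expected: int) -> float | None:
    """Scale a raw count (e.g. number of Innovation Fellows) to 0-1."""
    if value is None:               # data wasn't available for this variable
        return None
    return min(value, max_expected) / max_expected

def normalise_rating(value: int | None, scale_max: int = 5) -> float | None:
    """Scale a qualitative 1-5 rating (e.g. quality of CwC coordination)."""
    if value is None:
        return None
    return value / scale_max

def pillar_score(values: list[float | None]) -> float | None:
    """Average only the indicators that have data; None if none do."""
    present = [v for v in values if v is not None]
    return sum(present) / len(present) if present else None

score = pillar_score([
    normalise_count(3, max_expected=10),  # 'objective': fellows in operation
    normalise_rating(4),                  # 'qualitative': coordination rating
    None,                                 # indicator with no data available
])
print(score)  # 0.55
```

Averaging only the indicators that have data keeps a score computable, but it also means two operations’ scores may rest on different evidence: a small example of why missing data made comparison hard.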

As the concept of creating ‘Innovation Indicators’ developed, it was recognised that they could also be used for internal advocacy – to communicate both good practices and areas for improvement. One other key learning from this phase is that innovation cannot be measured by outputs alone: although outputs are inherently less subjective, they do not enable an assessment of the result of an intervention. It was at this point that we recognised the Index was inherently subjective.

After a number of revisions, the Innovation Service produced a draft copy of the Index, which was subsequently tested with a number of UNHCR staff of different genders, backgrounds, lengths of tenure in the agency, and levels of knowledge of humanitarian innovation.

The Failure: Why didn’t we publish?

Ultimately, the Index was never published. It was clear from those testing conversations that we needed to revisit the objectives and the purpose of the Index. With so many things on the agenda that were exciting both for us and for those we tested the Index with, we realised we were trying to do too much with one product.

Innovation is difficult enough to define as it is, and it will probably remain a porous concept. By blurring elements of monitoring and evaluation with a look at enablers of innovation, as well as stories of good practice, who knew what our admittedly loose target audience was meant to take away? Feedback showed us that consensus within the Innovation Service itself wouldn’t currently build the framework we need for innovation. Feedback also told us that too many projects and initiatives, whether inherited, politically driven, or disassociated from operational delivery, shouldn’t be incorporated into a set of criteria applied to operations. They are too specific to the Innovation Service itself; we couldn’t see the wood for the trees.

The Success: Understanding impact

There was no aspect of success. This was a dismal failure.

When we confront failure, we need to understand that failure derives from both external and internal factors. We must be able to ask ourselves the difficult questions: how engaged were we in making this a success? Were we diligent enough? At what point did we stop believing in the process, and why did we pivot so late? So as much as the failure of the Index to ‘launch’ was due to the very complex nature of measuring innovation, it was also due to the project team; perhaps the setup or resourcing of the team was wrong, or maybe the skillsets weren’t right to make this work. As we asked these questions and examined our roles in this failure, one of the surprising things about receiving feedback from UNHCR staff on the Innovation Index was how positive they were about its prospects. We were told not to stop working on this, and to keep re-evaluating its place in our organisation.

The Future: What’s next for the Index?

While we didn’t publish the Index, the Innovation Service learnt a lot about the thematic area of innovation – particularly how different parts of UNHCR Headquarters view innovation and UNHCR’s Innovation Service. Most excitingly, this helped bring a degree of clarity and understanding to the strategic direction of the Innovation Service. New products and directions were identified that can help support more consolidated objectives for how the Innovation team serves the field, including tools to support UNHCR operations in improving their practices. We want to be transparent about this and are open to comments and suggestions about how we can help. In due course, we’ll revisit the Index, most likely learning from this process by breaking the product apart. You’ll be able to find more examples and case studies on our website.

We want to start engaging with academic institutions and others to look at how we can develop a better understanding of the enablers of innovation. And before the Innovation Service starts looking into the actions of field operations, we want to start at the beginning: helping our field operations understand how they can be innovative, through innovation capacity-building sessions and toolkits. If there are things you’d like to see from us, please get in touch and let us know.