SPRING Programme Reflections

As we draw our evaluation of SPRING to an end, for this final blog in our series, Gordon Freer, the evaluation team lead, spoke to Heidi Ober, the evaluation's Project Director, about her time with the programme.

GF: You were the Project Director for the SPRING evaluation. What is your "elevator pitch"? How did you describe SPRING to outsiders? And how did you describe the TT role?

HO: As an experimental business accelerator programme that was meant to adapt as it went along, SPRING was hard to capture in an elevator pitch. We ran alongside a complicated, multi-faceted programme, asking evaluative questions that fed into the programme's adaptation. We played a key role in informing the decisions the implementers made, and in doing so we changed the way the programme was implemented, the way it supported businesses and the types of businesses it selected.

GF: This was a longitudinal evaluation – TT were involved from the very beginning.  How did this differ from other evaluations you have been involved in?

HO: I think there were two key differences: the first was our role, and the second was our methods. On the first, we played the role of 'critical friend', carrying out the evaluation less for accountability and more for learning. This feeds directly into the second: because we were focussed on learning, and on sharing those learnings with the programme as it pivoted and adapted, our evaluation needed to do the same. As a result, our methods, processes and timings were all changeable. The challenge, of course, was remaining robust while being flexible and adaptive ourselves.

GF: From an evaluation or methodological point of view what was the most challenging aspect of your work?

HO: Without a doubt it was timings. Since our evaluation was meant to feed into the adaptations, there was a lot of pressure on us to carry out evaluation that was rigorous enough to be credible yet fast enough to remain relevant to an ongoing programme that was constantly adapting. The cohorts, or time periods, for working with businesses were nine months long, and we were meant to carry out the evaluation and distil the learning in time for the next cohort. It often felt like we were running a relay race.

GF: For the last five or six years you have been walking this journey with SPRING as part of the evaluation team. What lesson or lessons will you take away with you that you can share with others who might be embarking on a journey to empower girls, or working with the private sector?

HO: A couple of key lessons. Businesses and development implementers speak different languages, and it can often be challenging to get them on the same page. Businesses may be motivated by different things than development actors, and it is important that development actors do not lose sight of this. One example is collecting indicator data for accountability against a logframe. Businesses knew they were participating in a development programme, but many only collected data that was relevant to their own business and decision-making. Getting them to collect additional information, such as data on gender and diversity, was challenging, and there were data gaps, which made accountability to the donors more difficult. Development donors need to understand this and, if such data is a requirement, think of solutions for tackling it: perhaps deprioritising that information or, if necessary, ensuring the programme's monitoring processes are sufficiently resourced to collect it themselves. This ensures the data is collected without overburdening the businesses.

A second lesson is to think about what a 'good enough' approach to using development tools might look like. Project designers and implementers need to take the time, resources and capacity of the businesses into consideration when planning the roll-out and implementation of tools. The example here was Human Centred Design (HCD). The implementers trained businesses in using it, and the businesses agreed it was very beneficial but also commented on the amount of resource it consumed. Perhaps engaging businesses with a 'good enough' version of HCD would be better in future programmes? It might not be the 'gold standard' of HCD, but it might be more useful for businesses, freeing up resources to be used elsewhere in the programme.

GF: If you had to do this all over again, what changes or alterations would you make to the TT role as evaluators?

HO: I think I would perhaps adjust our approach and methodologies. The turnaround times for design, data collection, analysis and sharing of learnings were very tight, and we often found ourselves running to feed into the next cohort. Perhaps I would not be so ambitious with the sample sizes and methodologies next time. This all feeds back into the initial evaluation design. As the programme adapted, we changed our evaluation processes as well, in consultation with the implementing partner and the donors, but perhaps we could have done this from the very beginning. Rather than trying to remain true to a model developed in isolation before the programme started, we could have reflected regularly and determined what was best for the next period of evaluation.

GF: Finally, you must have so many memories of SPRING: the evaluation team, the implementing team, the donors, the businesses, and importantly the girls. What are your favourite, or perhaps your most memorable, moments?

HO: Wow. So many. Despite all the challenges of carrying out the evaluation, I would say it was a pleasure working with everyone involved. The evaluation team and the implementation team alike believed wholeheartedly in SPRING and what it was trying to accomplish. It was an ambitious programme and, luckily, there were equally ambitious people involved who brought their best to it. It was inspiring to work with so many people who gave so much of themselves, and it showed: SPRING was a concept, an experiment and a success because of them. And ultimately, it impacted the lives of over 2 million girls. I feel lucky to have been a part of the journey.