Gordon Freer, the SPRING evaluation team lead, takes a somewhat light-hearted look at the final communication of our findings after nearly seven years of evaluation and, ostensibly, never-ending reports, and takes this opportunity to present the SPRING Final Synthesis Report and the final SPRING Evaluation video.
Anyone who is involved in evaluation will know the pain and the ecstasy of writing reports. The pain of ensuring that the structure, flow, argument, and logic are meticulously organised, that exact data are presented in the correct form, and that the argument is clearly stated so it can be easily understood by readers who have too much to read, too much to digest, and too much to comment on. And the ecstasy of finishing a good-quality, well-drafted report showcasing the findings.
Too often the findings, the lessons, and the areas for improvement are lost in the precisely crafted pages, buried in sections interspersed with context, methodology, and limitations. How can we as evaluators more effectively communicate the important parts of the evaluation? How can we point readers to the areas of the reports that contain the nuggets of information? We pondered these questions as the SPRING evaluators.
Remember the SPRING background: nearly seven years of evaluation, three evaluation components, hundreds, possibly thousands, of pages of reports, numerous databases of respondents, and volumes of data. Numbers, pictures, diagrams, words, names, places, columns, data, recordings, formulas, theories – all merging and mixing into … what?
We settled on creating two interlinked communication pieces: a synthesis report and an accompanying video. These two products were to deliver a synthesis of all the learning from SPRING in easily digestible ways.
Before writing the synthesis report, we designed it and set a page limit. We wanted colour and icons to make it visual. We planned something that people would want to pick up and look forward to reading. Then we started with the findings we wanted to communicate and built the rest of the report around them. We identified and signposted where these findings were explained in more detail in the earlier reports, and directed the reader to those sources.
We designed infographics and diagrams to illustrate the findings and kept the supporting text to a minimum. We eliminated the jargon, simplified the language, and redrafted the supporting text. Then we read the revised text with fresh eyes, ensuring it still conveyed the key messages. We put it all together to see how well all of these “parts” made up the “whole”. And then we tinkered, fiddled, and moved things around until we were satisfied. (It was during this last phase that we realised our page layout artist had the patience of a saint!) Importantly, we also included a placeholder for a hyperlink to the video URL.
Once completed, the final version of the report acted as the foundational script and structure for the accompanying video. But the limitations of time (no one wants to watch long videos) and visuals (some ideas are nearly impossible to portray visually) meant we needed to prioritise some lessons over others.
Then discussions ensued about the use of this word or that term. Was this too “jargony”? Would viewers understand that phrase? Numerous redrafts of the script followed, together with the first conceptual visuals, which were refined and tweaked. We revised the script again with small alterations, taking these insights into consideration, and with the final voiceover we ensured the script aligned with the visuals, adding almost indistinguishable pauses to give emphasis in the right places.
After seven years of evaluation, three evaluation components, hundreds of pages of reports, and volumes of data, we are pleased to share the SPRING Final Synthesis Report and the final SPRING Evaluation video.