
Aligning 4 tools for social impact

Aligning these four tools can accelerate program performance in social services.


In our experience, there are four key tools that social organisations should use to drive outcomes and the overall performance of their programs:

  1. A detailed, interactive service model - documentation of the theory of change, the tasks required to deliver the outcomes, and the operational templates and training

  2. A Monitoring, Evaluation and Learning (MEL) framework - defining what is measured, why, and how the data is used to make decisions

  3. Operational dashboards - data tools that present relevant information drawn from program data, making decision-making quicker and more evidence-based

  4. A quality improvement system - a process for using the data and the qualitative learnings of practitioners to continually improve the service model.



Misalignment


Many organisations have some version of these four tools, but they are often developed in isolation by separate teams. An evaluator or academic might have advised on the right metrics, a separate IT person might have built some dashboards, and the senior practitioner might have written up the service model or training materials for onboarding.


We've been working over the last year on a range of projects across these four tools, and we often run into friction where the metrics have been imposed from outside (say, by funder reporting requirements) and the service model is too vague to link metrics to.

This disconnection between the tools leads to problems. The measurement system might not be well aligned with the services delivered or with the decisions staff make as they guide clients through a program. Dashboards may then be of little practical use, because the data isn't timed to the need. A quality or evaluation process might be imposed by a funder and be unrelated to on-the-ground realities. And a service model might be too high-level (for example, only at the program logic or theory-of-change level), or out of date, so staff don't rely on it to manage their activities. This leads to one of the classic performance problems in social services: poor fidelity to the service model when it is delivered in practice.


Aligning the 4 tools


Service models and operational manuals are often high-level, static documents that sit on a shelf and aren't used dynamically to guide actions. We have been building a more sophisticated online software tool for documenting service models, which we call the 'Dynamic Service Model' ('DSM'). It allows more detailed specification of tasks and process flows, and gives all staff access to current, version-controlled information about best practice.


The Dynamic Service Model (Tool 1) also addresses the alignment problem by letting teams link metrics down to the individual task level, so it is clear which collection tools (e.g. survey or assessment forms) are used at which stage, and what data is needed to make decisions for each task - i.e. a MEL framework (Tool 2).
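As an illustrative sketch only (the class names, task names and metric names below are hypothetical, not part of the actual DSM software), linking metrics to tasks can be thought of as a simple data structure where each task carries its own metrics and collection tools - and the MEL framework then falls out of those links:

```python
from dataclasses import dataclass, field

@dataclass
class Metric:
    name: str
    collection_tool: str   # e.g. a survey or assessment form
    decision_use: str      # the decision this data informs

@dataclass
class Task:
    name: str
    stage: str
    metrics: list = field(default_factory=list)

# A hypothetical fragment of a DSM task list
intake = Task("Initial needs assessment", stage="Intake", metrics=[
    Metric("Baseline wellbeing score", "Wellbeing survey", "Set support intensity"),
])

# The MEL framework emerges from the links: every metric traces back to a task
mel = [(t.name, m.name, m.collection_tool) for t in [intake] for m in t.metrics]
```

The point of the structure is that no metric can exist without a task that uses it, which is exactly the alignment the text describes.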


This alignment then makes it easy to brief data analysts to build data tools such as dashboards that pull the right data and visualise it to support the tasks listed in the DSM. This is Tool 3: data tools and apps. It also gives IT teams a clear functional brief based on what data is useful, not just on what funder reporting requirements exist.


This data is then used to identify performance barriers and to analyse whether particular places or cohorts struggle to achieve outcomes. The team can then review and develop new ideas and solutions to close these performance gaps through a formal service improvement process (Tool 4).
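A minimal sketch of that gap analysis, with entirely hypothetical site names, rates and target, might look like this - each site's outcome rate is compared against a target, and underperforming sites are flagged for the improvement process:

```python
# Hypothetical outcome rates by site; all names and figures are illustrative
outcome_rates = {
    "Site A": 0.72,
    "Site B": 0.68,
    "Site C": 0.41,   # a possible performance gap
}

target = 0.60  # an assumed program-level outcome target

# Flag any site falling below the target for review
gaps = {site: rate for site, rate in outcome_rates.items() if rate < target}
# Sites in `gaps` would be taken into the formal service improvement process
```

In practice the same comparison could be run by cohort, time period or practitioner rather than by site; the mechanism is identical.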


Even if you have already done work on some of these tools, it can be worthwhile to go back to basics with your service model documentation first, as the bedrock on which your metrics, data and quality systems can build. Then build your MEL framework and data plan up from there.



