Social Impact: Will you be the first or the reluctant last in FE to measure yours?
Some would say that Lord Leitch challenged what he saw as the complacent established view that learning and skills provision was good enough. “Economically valuable skills is our mantra,” he said in the foreword to his 2006 report. The response to what was a widely respected review has been positive and far-reaching. It has not, however, yet extended to measuring those economically valuable skills for which the FE sector is striving.
Would that be measuring the immeasurable? Many, including the Sector Skills Councils, surely the bastions of Leitch’s principles, think not.
On 2 February, as explained in an earlier editorial on FE News, the Alliance of Sector Skills Councils published a research study, commissioned from Baker Tilly’s charity and education team, which looked at the social impact of the Sector Skills Councils. Showing annual returns of over £100m each from combined Government and industry funding of only £5m, the study has coloured considerably not only Central Government’s view of the value the SSCs bring, but also their own self-image. In the foreword, Alliance Chief Executive John McNamara laid down a challenge to each organisation in the Third Sector “to ask what impact it is having, and to determine how each wants to apply this approach”.
Social Impact and the SROI approach
Social Return on Investment (SROI), originated by the New Economics Foundation in 2002 and now embodied in a publication of the same name from the Office of the Third Sector (May 2009), is rapidly becoming the mainstream methodology for social impact evaluation.
Essentially, it looks at the Outcomes achieved by the work being done (the Activities and Outputs) and links those to financial measures, termed “financial proxies”. These measure the economic, social or environmental impact of the work being done (the intervention). We have found that three measurement approaches recur in every case:
- Economic or social impact – such as improved productivity, or reduced time lost to retraining, as a result of more effective training first time round
- Wastage avoided – such as improving the success rate on qualification-based courses, so that the funding devoted to them yields a gain in the form of better-trained workers
- Market-priced services – where a service that is otherwise available in the commercial market at a market price is being provided free or at a reduced price. Certain courses, such as ESOL or return-to-work courses, might fit here.
From the total evaluated impacts are taken three deductions: deadweight (“… it would have happened anyway …”), alternative attribution (“… it happened, but it wasn’t just your doing …”), and displacement (“… you achieved it, but unfortunately it also caused …”).
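To make the mechanics concrete, here is a minimal sketch in Python of how one evaluation might combine financial proxies with those three deductions. All figures, proxy values and deduction rates are hypothetical, and the sketch applies the deductions sequentially, which is one common convention rather than a prescribed rule:

```python
# Illustrative SROI-style impact calculation (hypothetical figures throughout).
# Each outcome is valued via a financial proxy, then the total is reduced by
# the three deductions: deadweight, attribution to others, and displacement.

outcomes = [
    # (description, units achieved, financial proxy per unit in GBP)
    ("Learners into sustained employment", 120, 9_000),  # proxy: wage gain
    ("Retraining time avoided by employers", 400, 250),  # proxy: cost per day saved
]

# Deduction rates (fractions of gross impact) - judgement-based estimates.
deadweight = 0.25     # would have happened anyway
attribution = 0.20    # due to others' work alongside ours
displacement = 0.05   # gains that merely displaced activity elsewhere

gross_impact = sum(units * proxy for _, units, proxy in outcomes)
net_impact = gross_impact * (1 - deadweight) * (1 - attribution) * (1 - displacement)

print(f"Gross impact: £{gross_impact:,.0f}")                 # £1,180,000
print(f"Net impact after deductions: £{net_impact:,.0f}")    # £672,600
```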
The life of the impact is then considered, and the projected future flows are reduced to a net present value by applying an appropriate discount rate.
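Continuing the sketch, discounting that net impact over an assumed five-year life might look like the following. The drop-off pattern and the 3.5% rate (similar to the HM Treasury Green Book social rate) are both illustrative assumptions, not figures from the Alliance study:

```python
# Net present value of a projected impact stream (hypothetical figures).

def npv(cash_flows, rate):
    """Discount a list of annual impact values (year 1 onwards) to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows, start=1))

# Suppose the net impact above persists for five years, tapering as it drops off.
projected_impact = [672_600, 538_080, 403_560, 269_040, 134_520]
discount_rate = 0.035  # assumed; check current guidance for an appropriate rate

print(f"Present value of impact: £{npv(projected_impact, discount_rate):,.0f}")
```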
How can social impact evaluation work for an FE college?
It is essential that the measurement is done from within, in the style of Action Research and supported by appropriate expertise, rather than as an exercise that is “done to” an entity. This improves the quality of insight into the evaluated projects, and secures a greater level of buy-in to the evaluation and its aftermath.
What do you measure? Surely not everything: you’d be there forever. It is better to select key curriculum areas, covering larger-scale groups of interventions, and consider in detail how they achieve their outcomes and for whom. When looking for sound measures to relate to the outcomes, think of the wealth of economic and social data that is already available at local or national level. Don’t expect it to match directly: you should expect to need several steps in the reasoning. If this takes, say, six steps, typically two will be based on data held by the organisation, two will be referenced to publicly available research, and two will be reasonable estimates or judgements. Accepting this makes the methodology workable, if perhaps a little insecure for those who like exact answers, and gives an enlightening vision of the college’s targets and achievements.
Pitfalls to avoid
The standard SROI approach, as set out in the Cabinet Office paper, brings the whole calculation down to a ratio of evaluated outcomes over inputs. The danger with this, as opposed to simply saying that £x of gain has been achieved from £y of input, is that it invites comparisons between organisations. Even between colleges, differences in fields of operation, in approaches to their missions, and in the way the evaluation has been done and the estimates made mean that such comparisons would be at best spurious and at worst seriously misleading.
In addition, the approach needs to take account of impact risk (through probability-weighted outcomes, for example, or variations in the discount rate), interaction with brand (probably the subject of a separate article), and a reasonable view of what is a predictable and proximate outcome from the intervention, and what is too remote to be evaluated sensibly.
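As a sketch of one such risk adjustment (probabilities, impact values and input costs all hypothetical), outcomes can be probability-weighted before the headline gain and ratio are computed:

```python
# Probability-weighted outcomes as one treatment of impact risk (illustrative).

weighted_outcomes = [
    # (net present impact in GBP, probability the outcome is actually delivered)
    (500_000, 0.9),   # well-evidenced outcome
    (172_600, 0.5),   # more speculative outcome
]

risk_adjusted = sum(value * prob for value, prob in weighted_outcomes)
inputs = 400_000  # hypothetical total cost of the intervention

print(f"Risk-adjusted impact: £{risk_adjusted:,.0f}")
print(f"Absolute gain: £{risk_adjusted - inputs:,.0f} from £{inputs:,.0f} of input")
# The headline ratio the standard approach asks for - treat cross-college
# comparisons of this number with caution, for the reasons given above.
print(f"SROI ratio: {risk_adjusted / inputs:.2f} : 1")
```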
The message for the sector
“Evaluated sensibly …”: perhaps that is the key message here. This form of measurement, or a variation on it developed by and for the sector, is surely the shape of tomorrow. The “sensible” approach is not just to apply the methodology carefully, but to embrace it and apply it with a true sense of what is being measured, before that approach to measurement is dictated by others and emerges in a form that doesn’t quite work. So will you be the first, or the reluctant last, to step up to the plate?
Jim Clifford is the head of charity and education advisory services at accountants Baker Tilly, the leading advisors to charities, independent schools and FE colleges. He was lead author of the report for the Alliance of Sector Skills Councils, published on 2 February, on the social impact of the SSCs: a high-impact piece of research that is significantly influencing funders’ views of the value brought by the SSCs.