Misaligned Means Much Maligned
The Skills White Paper claims colleges have been treated as “second rate” despite impressive results compared to others. Do our misaligned quality model and music journalism offer an explanation for this conundrum?
In the last two decades the most important external measures of the quality of a college’s provision have been Ofsted inspection outcomes and student achievement rates.
Not long ago colleges became the only part of the education system without a single institution graded inadequate
Not surprisingly, colleges have significantly improved on both counts during that period. Not long ago colleges became the only part of the education system without a single institution graded inadequate, and the percentage of colleges judged good or outstanding has often been higher than the figure for secondary schools, occasionally higher than primary schools, and always higher than private trainers. Student achievement rates have also risen steadily to record levels. Almost all college students achieve their qualifications, and those who don’t almost always have poor attendance.
Given that performance you would expect to see government, MPs, national media and employers lauding “world-class” colleges in the way they describe our universities.
Sadly, this is not the case. Indeed, the recent Skills White Paper contained the horrible assertion “for decades this sector has been treated as second rate”. Ouch! Even though the paper was careful to avoid saying it agreed, it didn’t distance itself from the statement, or single out those responsible for the mistreatment.
Future success measures
The final section of the White Paper set out the future success measures, and these explain why our quality model has proved so inadequate, misaligned and disconnected. They are all volume measures – more sub-degree HE enrolments, fewer skills gaps, more students on priority courses, fewer NEETs. These have always been the key measures of the sector’s reputation, but ones almost totally ignored by our two main measures of success.
Ofsted inspections have really been about professionals marking their own homework, based on what matters to the profession rather than what matters to the country. Frameworks have actively discouraged risk and expansion by inspecting what we do, rather than what we don’t.
For example, if an area of provision was passing 70 out of every 100 students, rather than 80, the measures encourage ditching the provision, so that zero students succeed rather than 70. No hospital trust would get away with closing all maternity provision because it wasn’t very good. If we don’t build the million new homes we need, pointing out the beauty of the ones we do build won’t disguise the failure.
Any quality model should reflect the national objectives
Any quality model should reflect the national objectives. If we want growth, then we must move away from metrics that ignore it. Sub-degree level higher education in colleges is shrinking fast; apprentice numbers are falling too; NEET numbers have reached shocking levels. That should be reflected in inspection outcomes.
Of course, some will argue that Ofsted now looks at how colleges identify and meet local skills needs, but this feels half-hearted and seems to be based on performative behaviours – depth of analysis of local need, number of employer engagements, what employers say – rather than cold, hard impact measures like increased enrolments.
This approach does colleges no favours. For example, we focus a lot on low maths and English resit achievement rates, taking the focus away from the fact that over half a million young people now have good GCSE grades, and well over a million young people have continued to study these vital subjects to 18, when they didn’t before. Surely one of the most astonishing educational success stories of recent times?
Equally, the focus on apprenticeship achievement rates, rather than achievement numbers, understates the contribution of colleges to the national priorities. The latest government data lists over 1,200 apprenticeship providers, and those with the highest achievement rates are singled out for praise. Yet, inevitably, such league tables are biased in favour of tiny, marginal providers. Of the top 100 in the achievement table, 71 have fewer than 50 leavers.
The country needs apprenticeship volume
The country needs apprenticeship volume. Of the 1,200 providers listed only 136 have more than 500 apprentice leavers. Colleges account for 43 of these strategically important major organisations and take up 19 of the top 60 places when these large providers are ranked for achievement rate, way ahead of almost all the major private trainers. This hardly qualifies for the description “second rate”.
People in charge of devising quality models often dislike models that leave little room for personal opinion. Data always needs interpretation, but if those judgements depart too much from factual results it damages transparency, reduces accountability, and forces wasteful investment in lobbying and challenging to secure a better interpretation. The apprenticeship data shows only two of the 10 lowest achieving colleges were judged below Good for apprenticeships, so most were judged equal to, or better than, five of the top 10 performing colleges. That is clearly illogical and unhelpful.
Contrast this with the way we measure quality of financial management in the sector. We don’t allow colleges to differently interpret their indebtedness, surpluses or cashflow.
What might a better aligned quality model reward?
So, what might a better aligned quality model reward? One obvious answer is growth in the right areas and growth in the number (rather than percentage) of achievers in those areas. Comparative data might also help. Surely a college delivering four times as many apprentices as another serving a similar population deserves more credit for that volume? Meeting local employer needs should be judged largely on the volume and growth of adult, commercial, apprenticeship and HE enrolments. If “engagements” with businesses don’t persuade them to part with their cash or their levy, can you really claim to have been effective? If you enjoy being oversubscribed rather than actively seeking to meet growing public demand, shouldn’t that be seen as a bad thing? If we aim to be inclusive, public services should be trying to give people what they want, rather than denying access.
School success measures are very different to the ones set out for colleges. A common quality model will therefore not align with both, with colleges the likely losers.
If our model continues to undervalue growth and scale we will not get the reputation and credit we deserve. Instead, we will be like those music critics who rave about obscure, barely-listened-to bands and denigrate the Adeles and Sheerans that sell in huge volumes all over the world. Colleges are mass educators. We are meant to sell out big stadiums, not be satisfied with playing to small audiences! Our quality model needs to reflect that.
By Ian Pryce CBE