It’s the silly season again, when half the country is on holiday and the other half are largely twiddling their thumbs waiting for them to get back so that life can return to normal. For the newspapers it’s the time when the inside pages, if not the headlines, are usually full of the bizarre and the faintly ridiculous. And for the government it’s the summer recess, when the slightly more crazy ideas developed over the past year can be given an airing, safe in the knowledge that not many people will notice what is happening and even fewer will feel inclined to protest.
Which brings me on to “Course Labelling”, the subject of much debate in the sector at the moment and the dream of UKCES that every potential student will be able to make a logical choice of which post-16 course to follow on the basis of facts relating to job prospects, wage gain, success rates, learner satisfaction and, of course, the views of Ofsted.
If only life were so simple. Even in the area of the “easily measurable” – rates of retention and achievement – we have still not reached the point where the results obtained in schools can be directly compared with those in colleges (a problem with data collection in schools, as I understand it). Admittedly that would be useful, but when we go on to consider the question of destinations, a whole new minefield opens up.
If we want to judge how successful a course is by the job outcomes following the completion of a programme, then there are some fundamental issues to be addressed. When, for example, is a job not a job? How does part-time shelf filling in Sainsbury’s on a Saturday compare with a full-time job with training at Marks and Spencer? What happens if those finishing a course get an excellent job, but in a field that has nothing to do with what was studied? (The destinations of hair and beauty students always provide good examples of this.) And how does all this aggregate into a league table score that enables one course to be compared with another – a golden dream about as likely to succeed as alchemy?
When we move on to attempting to calculate wage gain – the financial benefit to the individual of investing in education and training – the task is well-nigh impossible. There are the obvious problems of isolating the effects of other variables, but even were this possible, what seems simple and relevant just isn’t. When, for example, is the gain to be measured? Who measures and reports on it? And how relevant are the answers likely to be to those about to embark upon a programme when, by the time the statistics are available, the original course, staff or even institution may have changed beyond all recognition?
This all comes, of course, from a noble ambition – improving consumer choice through providing better information. Unfortunately it tries to create a degree of “certainty” in areas where such over-simplification is just not possible. What’s more, it bears little resemblance to the real questions that students most often ask and on which they frequently base their choice – where are their friends going to study, and what’s the social life going to be like when they get there?
So do we drop the idea? Yes and no. Let’s publish comparable retention and achievement rates for all institutions as soon as possible – no excuses, like for like – and require every school and college to list the initial destinations of their previous year’s students on their website, with “not knowns” clearly stated. But let’s not try to build a whole traffic light system around other, highly dubious data. To do so would create a bureaucratic nightmare and produce so-called information that, because of its poor quality, would probably do more harm than good.
David Collins is chief executive of the Learning and Skills Improvement Service (LSIS)