The current wave of changes to Apprenticeships represents more than just a shake-up of course content and funding provision, substantial though the changes to these areas are.
They also represent a major shift in the assessment of vocational skills: from a system in which training must culminate in a nationally recognised qualification, to one in which achievement of the required standard - ‘competence’ - is itself the valued outcome.
What is the rationale for dropping qualifications?
Put simply, the old system wasn’t really working. It was led by training providers rather than employers, was modular and piecemeal, and was so complex that it left even the most seasoned human resources professional in a dizzying spin. It also placed less focus on the behaviours (Employment Rights and Responsibilities aside) and competencies that, at the behest of industry, feature more prominently in the new standards.
In his 2012 review, Doug Richard was highly critical of the existing qualifications system, even going so far as to call it ‘the opposite of effective’. He found there to be too many qualifications per sector, which could in addition be ‘stitched together’ to give an even more confusing array of outcomes. He also bemoaned the overly detailed qualification specifications and occupational standards, claiming that they led to ‘bureaucratic box-ticking’ and an over-focus on assessment at the expense of learning and development.
His recommendation was, therefore, to turn the system on its head, giving employers the power to design and develop qualifications that are more fit for purpose. In fact, the move towards employer-led apprenticeships had already begun a few years earlier, with the launch of Skills for Growth: the National Skills Strategy in 2009, by the then Labour government.
The strategy was all about helping people to achieve skills that match the demands of the modern workplace: “the skills demanded by modern work in a globalised knowledge economy,” as per the foreword to the report. The report went on to state: “We need a system which is driven above all by the demands of the market. We need to give businesses more power to shape the provision of training through their choices and priorities.”
This focus has been carried on by the Conservative government ever since, and has been a key factor in the commissioning of the new ‘standards’ and the rebalancing of the system to put employers in the driving seat when it comes to formulating training and assessment content.
However, if employers really are to be in the driving seat, any artificial constraints imposed by the qualification system must be removed. The design of standards would only be hampered if their designers (representatives from industry rather than from awarding bodies) were obliged to reference, benchmark or align them against other subjects and industries. If the cart is not to pull the horse, assessment of the apprenticeship standards needs only to be self-referencing - it needs only to measure what it is important to measure.
Why external end-point-assessment?
"The testing and validation process should be independent and genuinely respected by industry." This was number 4 in the top 10 headline recommendations of the Richard Review in 2012. If qualifications exist because they provide information to employers or higher education providers about a candidate’s knowledge, skills and suitability, then they are only of value if they are trusted and transparent. If they are not, then they become something of a rock-breaking exercise – assessment for assessment’s sake.
Unfortunately, employers were losing faith in the old-style NVQs, Diplomas, Certificates and so on for the reasons already mentioned. The quantity of assessment – or evidence-gathering – had a detrimental knock-on effect on the amount of time actually spent learning skills. These qualifications were additionally seen as flawed because the assessors often had a vested interest in seeing candidates pass.
Much care has gone into the planning of end-point-assessments to ensure that they are completely impartial. The end-point-assessor will have no prior knowledge of the apprentice and have absolutely no stake in the training. What’s more, the ultimate decision maker over the provider of end-point-assessment is the employer, not the training provider. As an aside, the changes have created the need for a new army of ‘super-assessors’ who combine detailed and up-to-date industry knowledge and experience with advanced skills in assessment. We will no doubt see a big recruitment drive to fill these roles over the next year or so as the first wave of training based on the apprenticeship standards reaches fruition.
Is assessment therefore only at the end of an apprenticeship?
Absolutely not. While there is complete freedom and flexibility for the employers and training providers to work on-programme assessment into the delivery of training, it is common sense to keep track of apprentices’ progress towards meeting the required standards with regular benchmarking. Without this, it would, in fact, be very difficult to determine candidates’ readiness for end-point-assessment.
In addition, on-programme assessors would ideally be drip-feeding practice assessments to apprentices throughout their training to ensure they are comfortable and confident with the format, whatever that may be: interviews, multiple choice tests and so on.
However, it is important that this ongoing assessment does not grow disproportionately large. It does not count towards the end-point-assessment and should not be treated as such. It should only be used as formative and/or diagnostic assessment – and as a reference for the end-point-assessor to show that the criteria have been adequately met for passing into the end-point-assessment phase.
Won’t apprentices miss out when it comes to having a concrete qualification to put onto their CVs?
Under the new system, there is no requirement for a standard to end in a qualification because the fact of having completed the apprenticeship already denotes that the standards have been met. Indeed, whereas previously, qualifications were either passed or not passed (marks at a modular level being masked in the final result), the new end-point-assessments will carry with them a differential grading of pass, merit or distinction. The fact that the end-point-assessment doesn’t necessarily equate to a nationally recognised qualification such as an NVQ or Diploma doesn’t mean that some training courses won’t include them.
When employers and trainers design the curriculum or pathway through the standard, they may well decide to include professional qualifications and certificates as part of the ‘gateway’ – that is to say, the body of training that needs to have been successfully completed before end-point-assessment can take place.
Are we going back to a formal, written examination system?
On the face of it, the introduction of a synoptic end-point-assessment may look like a retrograde step, a shift away from the idea that modular assessment, particularly via coursework, offers a fairer evaluation of performance over time. However, it takes only a light scratch of the surface to reveal that the situation is far less reactionary than it appears.
While the final pass/fail assessment for the apprenticeship occurs at the end of training, very much like a GCSE or A-level, it does not necessarily take the form of a written examination. Each standard comes with an assessment plan which specifies the format of the end-point-assessment to be undertaken. While this may include a written exam, it may also include a business project, an interview or professional discussion, a competence activity, or a portfolio of work, and it generally combines a number of different approaches, giving apprentices a range of ways to demonstrate their competencies.