How might the move to End Point Assessment impact upon Apprenticeships?

Richard Marsh, Apprenticeships Partnership Director, Kaplan UK

Currently only 1% of apprentices are on the new standards, and are thus being assessed formally at the end of their apprenticeship. As the SFA tips the scales of funding in favour of the new standards, we all expect that number to climb in 2017.

But what impact does End Point, rather than modular, assessment have on the apprenticeship experience? Why was the change introduced and what can we learn from similar moves across education? 

From end point to formative assessment

Thirty years ago GCSEs were introduced so that every school leaver would achieve the same qualification. To ensure that GCSEs would work for all pupils, new forms of assessment were introduced, such as controlled assessment and coursework, ending the ‘all or nothing’ end-point exam tradition.

What happens in schools generally moves into FE, and so GCSEs led to the rise of the NVQ; its formative, modular assessment model found fertile ground in colleges and apprenticeships.

From formative to end point assessment

Fast forward to 2012, and Education Secretary Michael Gove said of GCSEs (in Parliament):

“…controlled assessment undermine(s) the reliability of the assessment as a whole. That’s why I asked Ofqual… to judge how we might limit course work and controlled assessment; it is proposed that course work and controlled assessment will largely be replaced by linear, externally marked end-of-course exams.”

And Doug Richard said (in the Richard Review):

“Continuous and time consuming assessment, driven by paper-based tests, accumulated ‘evidence’ and assessors with a vested interest in apprentices passing the test, demeans the apprentice’s accomplishment.  Instead, there needs to be a test… It should be primarily at the end of an apprenticeship, not measuring progress during it.”

What impact might we expect to see in Apprenticeships?

It is too early to see the results of the move to end point assessment in GCSE outcomes; but in 2016 there was a big reduction in early entries, increasing the average length of GCSE courses.

Duration

Duration may well increase as providers and employers make sure that their learners are as well prepared as possible before booking their EPA. It is also possible that there will be long waiting lists for EPA appointments in some subjects (as well as a lack of choice of EPA partner). Increased duration is generally seen as a good thing in apprenticeships, and there should continue to be a healthy tension between cash flow and timely success rates.

Quality

The pedagogy of apprenticeships is well established and I do not think there is any evidence to suggest that current training and assessment techniques can or should change radically. Employers and providers do seem to be looking to deliver more learning online and to encourage greater self-study. Both of these things can be achieved if the average age and academic level of apprentices continue to rise and a more mature, confident apprentice cohort accepts greater responsibility for its learning.

A clearer distinction between training and assessing should also emerge, as should the possibility of failure for those not committed or skilled enough to achieve their apprenticeship.

Currently apprentices rarely fail: they either drop out or don’t achieve their functional skills. This lack of failure was one of Richard’s criticisms. Perhaps an increase in the failure (rather than drop-out) rate will improve the perception of apprenticeships, as he suggested…

Academic failure is heavily sanctioned in the education world, and the SFA currently promises to end the contracts of those with less than a 62% pass rate in apprenticeships. This kind of pressure creates caution, restricting the take-up of new standards and the creation of new models of delivery.

If an apprentice is unlikely to pass, is there any point in them undertaking their EPA anyway? And how would we know whether they are ready, if not through formative assessment and accumulated evidence?

Where the value of the apprenticeship itself is less than the value of a qualification contained within it (such as a university degree), what value will the individual perceive in their EPA and FISS certification if they have already donned cap and gown to receive their degree? In this scenario completion rates could fall, but it wouldn’t be a reflection of rising quality.

Capacity

The biggest impediment to the move to a widespread EPA model is capacity. Currently Awarding Organisations (AOs) quality assure the assessment of apprentices through a system of internal and external verification that has been built over decades. It would be a huge undertaking to move this capacity from providers to EPAs, and the reduced provider role does not necessarily offer a balancing level of savings if providers will still need to assess learners before releasing them for EPA.

Pragmatically, we are surely going to see ‘EPA recognition’ of good training provision eventually.

The current system is based upon the assessment of progress against the criteria set out by an AO (although this varies by sector, it is a common element of all ‘frameworks’). The inertia around standards is in no small part due to the lack of these pathways and the size of the investment required to create two-, three- or four-year apprenticeship learning programmes for those providers that do not possess their own materials.

Potential EPAs are not commercially motivated to provide these learning pathways, and so it is providers and employers who are each being asked to invest individually in the creation of their own new standard programmes and materials.

Implementation

It may be a long time before we can judge the impact of the switch to EPA, simply for lack of evidence, as so few apprentices are undertaking the standards that have replaced frameworks.

We know that government wants employers to have a direct relationship with EPAs, but at the moment providers remain the EPAs’ paymasters and so it is that relationship that remains key. And the two parties are engaged in a Mexican stand-off:

The cost of developing materials to replace AO resources for lower-value standards is holding back providers, and the cost of replacing providers’ Internal Verifiers (IVs) is holding back AOs.

Perhaps the opening of the Levy will grease the wheels, but I suspect that further flexibility or support will be necessary from government before we ever see a critical mass switch from the old to the new.

We are likely to continue to see most progress in the deployment of new standards in job roles where there is already a well-proven vocational pathway, well-resourced training providers and a well-funded standard.

Richard Marsh, Apprenticeship Partnership Director, Kaplan UK 

