From education to employment

Overcoming the challenges of end-point assessment

Jacqui Molkenthin, JEML Consulting

Having worked in end-point assessment since its inception back in 2014/15, I have seen my fair share of challenges, and worked through a fair number of solutions, alongside the amazing sense of pride as you see apprentices complete and achieve.

It is fantastic to see the growth and evolution of the sector, and I am a passionate advocate of apprenticeships and end-point assessment.

Back in September 2018, I began monitoring the registers, and since then we have seen the number of apprenticeship standards grow from 330 to 506, and the number of End-point Assessment Organisations (EpAOs) from 165 to 254. This phenomenal growth brings with it challenges, and of course solutions.

Since the start of this year, we have seen changes and improvements to:

  • The register application process,
  • A more rigorous and robust External Quality Assurance (EQA) framework,
  • The digital services supporting apprenticeships (for employers, providers and apprentices).

There are many other improvements, but I cannot name them all here!

But a number of challenges remain, so I thought I’d share some of those I have come across, and my thoughts around possible solutions.

You never know, some may already be ‘in train’ by the ESFA / IfATE.

You may not agree with my thoughts on solutions, but they could start a debate and lead to some innovative approaches. (Please feel free to comment below.)

Challenges and Solutions

1. Errors in assessment plans


Over the past few months every single assessment plan I have been using has contained errors. Some are relatively minor ‘cut and paste’ errors, but others are more significant, making it difficult to ensure the design of robust assessment tools.

Here are some examples (as of 22 October 2019):

  1. In the employability practitioner assessment plan, page 13 says K3 and K16 are measured by the presentation, yet the appendix on page 40 says they are assessed by the knowledge test
  2. In the retail team leader apprenticeship, appendix Ei has lost its last box on diversity, and annex D (third bullet) refers to an observation, but there is no observation in the retail team leader EPA
  3. In the Learning and Development Consultant Business Partner assessment plan, page 4 says the presentation is 15 minutes, but page 20 says 25 minutes


I would recommend that the IfATE put in place a final independent proofread and check of each assessment plan before it is published.

2. Standard and Assessment plan reviews


We know that the standards state roughly when they are due for a review/renewal (e.g., within 3 years), and we know that the IfATE carry out statutory reviews, but other than a mention in the IfATE business plan, I can’t spot a review cycle calendar on the IfATE website.


I think that a published calendar of reviews would be particularly helpful for EpAOs and providers to support business planning. EpAOs may need to allocate resources to amend EPA tools, or determine when or whether to apply to assess additional standards; providers may need to allocate resources to update or amend the curriculum and schemes of work; and both may need to revisit costs to ensure financial viability should the funding bands be reduced.

3. Standards without an approved External Quality Assurance (EQA) provider


I cannot locate a published process for assuring the quality of EpAOs where the EQA provider is yet to be approved.

Looking at the 17 October 2019 IfATE standards spreadsheet, there are 31 standards that do not yet have a named EQA provider, and a further 64 whose named EQA provider is not yet approved; 65% of these had apprentices on programme as of July 2019. Page 41 of the EQA manual refers to the IfATE working with stakeholders to support EQA activity in the interim, but this relates to the withdrawal of recognition of an EQA provider, rather than the process whilst an EQA provider is awaiting approval.


I would have thought that a logical approach would be for the IfATE to deliver the EQA functions whilst the EQA provider is awaiting approval, but it would be good to see a published approach so that the quality of EPA across all standards, not just those with an approved EQA provider, can be assured.

If anyone is aware of a published process, please add a comment to this article.

4. Changes to assessment plans that are not part of the IfATE statutory reviews


Page 34 of the EQA manual refers to minor, intermediate and major changes to assessment plans as part of continuous improvement. This responsiveness is valuable in ensuring assessment is deliverable, especially the first time a standard is delivered.

However, this brings with it a range of challenges for EpAOs, many of which have been well documented by EpAOs:

  1. When do the changes get implemented? The EQA manual refers to timeframes for the solution but does not appear to state the timeframes for implementation. EpAOs will need to allocate resources to make the changes to their assessment tools, so clear implementation timeframes are critical. There is also the apprentice to consider here: under the ESFA funding rules, apprentices are funded under the rules applicable at the time they started.
    Solution: Should the ESFA/IfATE consider a similar rule for standards or assessment plans, whereby the apprentice is assessed under the same assessment plan that was in place at the time they started their apprenticeship?
  2. Who is responsible for notifying employers and providers? Neither the EQA manual nor the EpAO conditions states this.
    Solution: I don’t think it matters who, so long as it is clear and consistent. I expect that much of it could be automated using ILR data to identify the providers delivering the standards where changes have been made to the assessment plan.
  3. How does this fit with EpAO standardisation activities? All EpAOs are required to undergo standardisation of their assessment tools at least annually. If a change to the assessment plan means that their EPA tools and materials have to be reviewed and updated, does this restart the annual standardisation clock, or does it continue under the original annual standardisation cycle?
    Solution: I would recommend that the annual cycle runs from the last or most recent update/change.
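
The notification point above (who tells employers and providers about a change) lends itself to automation using ILR data, as suggested. A minimal sketch is below; the record shapes, field names and standard codes are illustrative assumptions, not real ILR fields:

```python
# Hypothetical sketch: finding which providers to notify when an assessment
# plan changes, from ILR-style delivery records. All names are illustrative.

CHANGED_STANDARDS = {"ST0000"}  # standards whose assessment plan was updated

ilr_records = [
    {"provider": "Provider A", "standard": "ST0000"},
    {"provider": "Provider B", "standard": "ST0099"},
    {"provider": "Provider C", "standard": "ST0000"},
]

def providers_to_notify(records, changed):
    """Return providers currently delivering any standard whose plan changed."""
    return sorted({r["provider"] for r in records if r["standard"] in changed})

print(providers_to_notify(ilr_records, CHANGED_STANDARDS))
# prints ['Provider A', 'Provider C']
```

The same lookup could equally drive an automated mailing to the affected providers, which is why it matters less who owns the step than that it is done consistently.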

5. The role of the EQA provider in determining the methods of delivery of the assessment


Should an EQA provider instruct an EpAO how to assess, or should an EQA provider check that the EpAO is compliant with the assessment plan requirements?

For example:

  • People First have a common approach document detailing how EpAOs should assess,
  • Open Awards “observe the EPA process in order to ensure that due process is being followed by both the IEPA and the EPAO in respect of the assessment methodology being compliant with the relevant apprenticeship standard assessment plan”, and
  • Ofqual have a set of qualification-level conditions which state that the EpAO must “interpret the assessment plan”.

Page 9 of the EQA framework does state that “where needed, the EQA provider may provide guidance to EPAOs on the design, development and implementation of methods of assessment to ensure consistency of interpretation”, but the IfATE website also states “Whilst the policy intent is that there should be different EQA models available to reflect the needs of different sectors it is important that any action taken on the back of these different models is consistent”.


I think that this requires greater clarity and consistency so that there is a level playing field, and so that EpAOs know exactly which set of documents to adhere to when designing their EPA tools, and when carrying out internal quality assurance and standardisation.

6. EpAO rating


At the moment there is no nationally recognised way of establishing the quality of the EpAO. The EQA provider uses a risk rating system for reporting to the IfATE on the quality of the EpAO, but this is not published, although the EQA framework does say “The Institute plans to publish reports or elements of them at some point in the future”.


There are some non-governmental sites where you can rate an EpAO, but the sector would benefit from a nationally recognised grading/rating system, sooner rather than later, much like Ofsted in the provider world.

7. Clarity in the conditions for being on the register of end-point assessment organisations


There are a few areas of the conditions that would benefit from greater clarity.

  1. Contracts – the conditions state that the EpAO must contract with the provider (section 4), but they do not say that it must be the main provider. The funding rules make it clear that it must be the main provider that contracts with the EpAO, but EpAOs are bound by, and follow, the conditions as opposed to the rules, so there is a risk that an EpAO may mistakenly think it can contract with a provider that is a subcontractor. Solution: A small amendment to the conditions to specify ‘main’ provider would remove this risk of confusion.
  2. Record keeping – who retains what? I have encountered a few debates between providers and EpAOs around this, due to differences in interpretation. The provider funding rules state that the provider must retain an evidence pack which includes records and evidence of completion, but they do not say evidence of the completion of end-point assessment activities, so this could mean completion of the training/learning. The EpAO conditions state that the EpAO must retain records to prove they have seen and checked gateway requirements, and that they retain information about the end-point assessments undertaken, but that does not mean they have to retain the gateway evidence (they just have to prove they have seen and checked it), nor does it necessarily mean they have to retain the apprentice EPA products (they have to retain the details of the assessment undertaken, which could be interpreted as just the assessment report). This leaves debate about the retention of documents such as an apprentice end-point assessment project, showcase and so on. Solution: I think it would be helpful to make this clearer in the conditions and/or rules to avoid future debate.

8. Ability of EpAOs to deliver EPA for an entire standard


With the growing number of what I call ‘hybrid’ standards, where there are multiple specialisms / pathways and sometimes differing assessment methods across those specialisms within one standard / assessment plan, is the register application process acting as a blocker to new entrants?


  1. Is it time to consider whether the register should allow EpAOs to offer EPA for certain specialisms within a standard rather than the entire standard?
  2. Or, is there a place for some form of subcontracting in end-point assessment, whereby the lead EpAO has overarching responsibility for governance, policies, standardisation etc, and subcontracts the assessment expertise to occupationally specialist organisations? (The conditions, para 3.5, do not allow the contracting out of EPA.)

I am expecting you to be shouting ‘no, no, no’ at my suggestion 8b, and I am no fan of subcontracting, but it would be great to come up with an innovative solution that continues to enable occupational experts and specialists to remain in, or enter, the EPA marketplace. You never know, an innovative approach may also help alleviate the reported struggles with the recruitment of occupationally competent assessors.

9. Market intelligence


There are 140 standards (28% of all standards) without an EpAO; half of these were approved before the end of 2018.


One of the best ways to attract new EpAOs, aside from funding, is to make sure the market intelligence is readily available so that organisations can make a business decision as to whether to apply to become an EpAO.

Recently the ESFA launched a fantastic tool showing approved apprenticeships alongside apprentice volumes per apprenticeship; it would be great to add details of the providers delivering to those apprentices, and their geographical coverage.
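
As a rough illustration of what such enriched market intelligence could look like, here is a minimal sketch combining apprentice volumes with provider and regional coverage per standard. All dataset shapes, standard codes and field names are illustrative assumptions, not the ESFA tool’s actual data model:

```python
# Hypothetical sketch: a combined market-intelligence view per standard.
# Volumes and delivery records below are invented, illustrative data.

volumes = {"ST0000": 420, "ST0001": 35}  # apprentices on programme per standard

deliveries = [  # which provider delivers which standard, and where
    {"standard": "ST0000", "provider": "Provider A", "region": "North West"},
    {"standard": "ST0000", "provider": "Provider B", "region": "London"},
    {"standard": "ST0001", "provider": "Provider C", "region": "South East"},
]

def market_view(volumes, deliveries):
    """For each standard, combine volume with provider and regional coverage."""
    view = {}
    for code, count in volumes.items():
        matching = [d for d in deliveries if d["standard"] == code]
        view[code] = {
            "apprentices": count,
            "providers": sorted({d["provider"] for d in matching}),
            "regions": sorted({d["region"] for d in matching}),
        }
    return view
```

A prospective EpAO could then weigh volume against how crowded and how geographically spread the delivery already is before deciding whether to apply.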

I hope I’ve given you some food for thought and debate. It is always important to remember that no challenge is too great to overcome, and that every problem has a solution.

Jacqui Molkenthin, JEML Consulting
