
Turning an assessment plan into assessment tools – 11 hints and tips

I have worked with many EpAOs designing and developing assessment tools and materials, so I thought I would share some of my hints and tips to help other EpAOs as they design and develop. This will not cover everything, but hopefully it provides a flavour to support your processes.

1. Check, Check and Check Again

Go through the assessment plan with a fine-tooth comb and check that there are no errors that could cause problems later. I have lost count of the number of assessment plans with errors: KSBs mapped inconsistently between the main document and the appendices, scores that don’t add up, inconsistencies in assessment timings, or weightings that can’t be effectively applied. If you do find errors, contact the EQA provider for clarification. It is far better to sort it at the start than to hit a wall halfway through!

2. Be very clear on the assessment plan and standard version number you are working to, and any temporary flexibilities or dispensations

There are a number of standards with multiple versions now, and it is easy to get confused, so be very clear from the outset which version you are approved for and are designing/developing to. This will also affect your customer base, as some apprentices may have started on different versions of the standard. All versions, along with details of flexibilities and dispensations, are published on the IfATE website. As part of this, be mindful of the status of the standard in relation to review, as that will affect your future plans. Details of reviews are also available on the IfATE website.

3. Identify the Right Experts to Design and Develop the Tools

When designing and developing assessment tools and materials, remember that whilst sector expertise is paramount, so too is expertise in designing different types of assessment. Don’t forget that some assessment plans also require the EpAO to work directly with employers when designing assessments. It is absolutely critical to get the assessment methods right, and you can only do this with the right set of expertise in place. I have seen questions raised by Ofqual in recognition rejection letters relating to assessment expertise, so you can see just how important it is to have the right expertise in place at the outset. Please always remember to have the appropriate confidentiality agreements, NDAs and conflict of interest checks and mitigations in place for all those involved in design and development.

4. Identify Experts for the Assessment Delivery Mechanisms

Don’t forget that design and development is also about the systems being used to deliver. You will need access to expertise in assessment delivery systems to design for a range of delivery methods, whether that be face to face, remote, paper based, exam room, on employer premises/sites, online, or proctored. All of these methods will require different mechanisms to ensure security, accessibility and appropriate question selection, to prevent any element of predictability, and to cover other such considerations. You may already have delivery systems in place, but do you have mechanisms to check that the system(s) you have are fit for the assessment methods they will be used for? If you are using third parties for systems or delivery sites, have you got clear contract terms and/or service level agreements for their use, and clear operation/access instructions and methods of review?

5. Set the Boundaries and Parameters at the Start of the Design Process

It may sound like an obvious thing to say, but your experts will all need to be working consistently. Consistency in design and development will help to ensure consistency in delivery. Please don’t just issue your designers/developers with a copy of the standard and assessment plan; you must brief them, train them and issue clear instructions and guidance. You should cover what you are asking them to do, the timeframes for the work, the grammar and language to be used in design, designing to the level and gradings of the standard, the use of reference materials and so on. Don’t forget (and it may sound obvious) that if they are designing questions, they also need to detail the expected/required answer or response. I have seen examples of experts designing interview questions but not detailing what they expect to hear as a response. I have written a range of articles on levels and question writing on FE News which readers may find helpful.

6. Consider Issuing Templates to Get the Design Process Started

I have often found that the design/development process has run more smoothly when initial templates have been produced with context, a structure, and ideas for each assessment method. This may not work for everyone, but I have found that it gives the designers a starting point and confidence in what they are doing. For example, I have used the standard and assessment plan to produce sample assessment templates for each assessment method, mapping in the appropriate KSBs and scores. This has enabled the designers to see exactly what they are aiming for and to be most effective in their design. The more recent assessment plans have a lot of this in place already, but I have always found that the use of templates helps to provide structure and clarity, and to bring consistency.

7. Be Prepared for Disagreements

When working with a range of experts, you will find that they invariably have differing interpretations and views on elements of the assessment or the KSBs. I have attended many design meetings with disagreements between experts, but these have always been positive, despite the challenges, as they get all parties to think and debate their way to a solution. It has also been a brilliant way of ironing out the levels and grading for assessment questions/activities, and of identifying any risks that may crop up during delivery.

8. Ensure Accessibility

With such an array of assessment and delivery methods, it is absolutely critical that you embed accessibility in your design and development. This can range from ensuring that the language used is free of geographical colloquialisms and of gender, ethnic, age, political or cultural stereotyping or discrimination, through checking that a question can be viewed and read in all formats (paper, tablet, computer, mobile phone), to providing guidance for assessors on how to assess without bias. But don’t forget that it is also about the delivery premises. Given the wide variety of assessment methods, you must be sure that the locations and equipment are accessible to all types of learners. When looking at equipment, you also need to be mindful of brands: if a learner has learnt their trade on a specific brand of equipment, should the same brand be used in assessment? Some would say yes for fairness; others may say no, as learners should be able to use all types of equipment in their trade. I will leave that debate to you.

9. Review, Review, Review

Some call this critical evaluation, but I think that is an overly negative term to use. In essence, what I mean is that you must review everything that is being designed and developed to make sure it is fit for purpose. For example, does what has been designed map to the KSBs and meet the latest industry regulations? You will also need to review to check that what has been designed isn’t at risk of differing interpretations, that it is precise and based on facts rather than opinions or assumptions (if assumptions are used, they must be made clear), that it doesn’t directly or indirectly bring in bias either in the way the question is asked or the way an assessor may assess it, and that the language and grammar are accurate and appropriate. I have often found that a checklist template helps to focus the review process and ensure that nothing is missed. It is important that these reviews are undertaken by people with expertise and not just treated as administrative checks.

10. Identify Methods to Test

My biggest bugbear here is that people get carried away with academic theory. I am a person who focuses on plain English and practical application. Therefore, my suggestion would be to look at a range of models of testing, but only use what: (a) is appropriate to the assessment methods, (b) is not overly complex, and (c) you can deliver. Never say you will do something that you won’t or can’t end up doing. You may, for example, want to run the same test with two groups and compare results per question, or run a mock test with two assessors and compare assessment judgements. I am not saying that you have to test everything you design/develop; what I am saying is that you must have a mechanism to ensure that what you have designed is fit for purpose, and running tests is one method of achieving that.

11. Proofread

Even I admit to being guilty of this. When you have worked so hard on something for so long, your brain can miss errors when proofreading what you have produced. Using a spell checker is not enough; my common errors are ‘of’/‘or’ and ‘now’/‘not’, errors which a spell checker will not pick up. Wherever possible, subject to confidentiality, use colleagues who have not written a particular document to proofread it. It is amazing what errors can be spotted.

And on a final note – always remember to have clear guidance/instructions accompanying everything that you have designed and developed, so that it can be operated and delivered effectively and consistently.

I hope that these hints and tips are helpful. You can access all of my articles for EpAOs on FE News, and weekly updates on all things end-point assessment via my LinkedIn page.

