LSDA Research Highlights Fault in “Unending Treadmill of Formal Assessment”
A report from the Learning Skills Development Agency (LSDA) has criticised the “unending treadmill of formal assessments” in the English education system.
The report, carried out by Manchester Metropolitan University on the LSDA’s behalf, is the first of its kind to encompass all types of post-16 education and training, including school sixth forms, further education colleges, workplace training and adult education.
The aim of the report, which was based on interviews with 237 students, was to assess the usefulness of various methods of assessment, and their effect on student performance. To achieve this, the experiences of learners in different settings were compared and contrasted to identify the methods of assessment that best enabled students to succeed in specific sectors and environments.
Traditional Approach Losing Appeal
The research showed that traditional examinations and written work were unpopular with many students. Students in the learning and skills sector favoured practical tests, project work and online testing, whilst A-Level students were found to prefer written assignments and exams. Those taking National Vocational Qualifications (NVQs) preferred practical assessments. Crucially, the results showed that students wanted to be assessed in appropriate ways and to be allowed to demonstrate their skills. In practical subjects, for example, the report found that considerable emphasis is still placed on writing rather than on gaining practical experience. The report framed this problem as a need for “fitness for purpose”.
Andrew Thomson, Chief Executive-designate of the Quality Improvement Agency for Lifelong Learning (QIA), agreed that the report raised some important issues surrounding the assessment of students. “The whole point of education must be to help people to succeed,” he said. “So if offering students a choice on how they want to be assessed helps them to progress, then this needs further investigation. We are good at testing people’s knowledge, understanding and ability to write cogently, but arguably not so good at testing practical competence or different forms of intelligence. The issue is whether learning has taken place and if the student is being motivated to achieve.”
Box Ticking
Researchers also spoke to 95 assessors, including staff at awarding bodies, teachers, employers, supervisors and college heads of department. Some raised concerns about the problems inherent in multiple-choice tests and practical assessment, warning that these methods could bypass the chance to gather direct evidence of a student’s skill level. Furthermore, it was argued that the authentic working situations necessary for practical assessment were often hard to recreate. Multiple-choice tests, though popular with many students, give little opportunity to test literacy, and can reduce the learning process to “box ticking”.
The report also confirmed that a trend amongst awarding bodies to provide greater support and feedback to students has had a direct impact on their performance. Giving students increasingly close support from teaching staff and more guidance on how to take exams has been shown to have a positive effect on success rates. However, this process has been described by some as “dumbing down” the education system, and the report therefore suggests that more research and discussion are needed in this area before definite changes can be made.
Challenge of Learning
Harry Torrance, the Head Researcher from the Manchester Metropolitan University team that carried out the research, said that the report had identified a number of important issues about the purpose of education and assessment. “The clearer the task of how to achieve a grade or award becomes, and the more detailed the assistance given by tutors, supervisors and assessors, the more likely are candidates to succeed,” he said.
“But succeed at what? We are in danger of expelling the challenge of learning.” He expressed concern about the movement towards education goals based on qualifications, and warned of the risk of “assessment as learning”, where preparation for assessments and exams dominates the learning process. “The danger is that all they come to understand is how to comply with assessment criteria, rather than the bigger picture and what their programme of study or training is really all about. We are not arguing against the importance of qualifications; far from it. But we are arguing for an assessment regime that is ‘fit for purpose’ and supports learning rather than replaces it,” he said.
Jess Brammar
Keep up to date with the new QIA right here at FE News!