
AI in Education: A Human Approach


Ben Knight, Head of Language, Content and Pedagogy at Oxford University Press, reflects on the enthusiasm shown for AI in education, and how we can ensure that the vital human elements of education are preserved during this period of exciting technological innovation.

Educators’ Optimism and Concerns in the Era of AI in Education

A surprising aspect of the explosion of interest around AI in education is the enthusiasm from educators themselves. Teachers who have lived through various overhyped technology ‘revolutions’—language labs, MOOCs, Computer-Assisted Instruction, Badging, Blockchain, Google Glass—could be expected to roll their eyes at the noise around Generative AI. But in fact, there has been rapid recognition of the genuine impact that AI will have on teaching and learning—both on the pedagogy (the ‘how’) and the curriculum (the ‘what’). An OUP report on teachers’ response to AI found that more than 70% of teachers are optimistic about the role of AI in education.

Teachers can see how AI can help them prepare better lessons faster. They can use AI-driven tools that provide far more information about each student in their class, identifying who is falling behind and in which areas, and who is probably getting bored because they are ahead of the pack. They can also see how AI could help them with the perennial problem of mixed-ability classes, personalising learning activities for each individual.

They can also see that, for their students, the goalposts are moving. Producing a passable essay without grammatical errors will no longer be an educational goal. The goal will be to develop in students the skills to use Generative AI to produce outcomes (essays, analyses, designs, experiments, presentations) that are better than the ‘passable’ outcomes of unskilled users. We are moving from ‘What are the causes of global warming?’ to ‘Improve the accuracy of this report on the causes of global warming’.

But there is also a growing awareness of the risks around AI. Education is an intensely human activity: developing the minds and lives of children and young adults. How can we ensure that AI supports our human goals and ways of thinking? How can we protect ourselves and our students from risks to privacy and security, from increasing stress levels, and from a widening educational gap between the poor and the privileged?

Fostering Human-Centred AI in Education

A Human-Centred AI (HCAI) approach aims to address these concerns. A leading proponent, Ben Shneiderman, sees the challenge as creating a future in which computing devices dramatically amplify human abilities, empowering people and ensuring human control. HCAI therefore requires clarity about both human needs and human control.

We can start by thinking about where educators currently struggle to meet needs and where AI could help. A few examples might include help with managing a class with mixed abilities and interests; tailoring classroom activities to make them more relevant to their students; and giving instant personalised feedback to students on their assignments.

Some necessary administrative tasks are straightforward but take up valuable time that could be better spent on teaching. AI could assist with attendance checks, sending reminders to students when homework is due, and creating reports for parents, managers, and students.

The other dimension of HCAI is human control. Shneiderman argues that we are misled by journalistic and sci-fi tropes into seeing AI as a set of autonomous intelligent agents. While this is not completely baseless, it doesn’t reflect the reality of AI in most cases. We should instead see AI as a ‘supertool’: something humans have designed to meet our needs, and under our control. Building ‘human control’ into AI requires us to think about where automation is needed, where human control is needed, and where we need combinations of both.

In education, we need options for human control when decisions need to consider qualitative or emotional dimensions that are difficult for AI systems to evaluate reliably. Teachers know that the way they give negative feedback to one student may not work well with another. They know that an unexpected answer from a student may reveal an imaginative response that should be encouraged, not rejected. AI-based auto-marking of homework essays is great for fast individualised feedback, but its accuracy may not be reliable enough for a grade that determines a young person’s future. The higher the stakes, the more responsibility we have to ensure human control.

We also need to build basic AI literacy among teachers and educational leaders. An essential requirement for an HCAI approach in education is that teachers and other educationalists engage actively in the development and use of AI. The case of AltSchool illustrates what can go wrong when an educational enterprise is driven by technologists and venture capitalists. But teacher engagement requires some understanding of, and confidence with, AI.

At the heart of an AI literacy programme are three key areas of competence: conceptual competence – a basic understanding of what AI is and how it works; evaluative competence – a basic understanding of the opportunities and risks around AI in education; and operational competence – basic skills in identifying, accessing, and using common AI tools relevant to education. There are some great initiatives to support this, such as Harvard University’s AI Pedagogy Project and the UK’s cross-sector AI in Education initiative.

By Ben Knight, Head of Language, Content and Pedagogy at Oxford University Press



