The Schools White Paper Just Showed Us Where AI Policy Is Heading. FE Needs to Pay Attention
Two weeks ago, Anthropic updated Claude and share prices tumbled across financial services. A single model update from one AI lab sent a tremor through an entire industry. That is the speed at which the ground is shifting.
Against that backdrop, the UK government has just published its schools white paper, Every Child Achieving and Thriving. And buried inside a document that the media has covered almost entirely through the lens of SEND reform and trust structures is something that should matter enormously to everyone in further education: the most explicit AI education policy the UK has yet produced.
What FE leaders need to know
The white paper commits £23 million to a four-year EdTech evidence programme, announces sovereign AI safety and pedagogy benchmarks, launches an AI Safety and Pedagogy Taskforce bringing together teachers and AI experts, and promises teacher co-created AI tutoring tools available to schools by 2027. It names Google DeepMind and OpenAI as partners. It commits to digitising the National Curriculum as infrastructure for the entire EdTech sector.
Most significantly, it draws a line that the research community has been arguing for years: general-purpose AI in education often has negative outcomes, because it gives answers rather than building understanding. Purpose-built educational AI, rooted in pedagogy, shows promise. That distinction is now government policy.
This is a schools white paper. But FE leaders who treat it as someone else’s business are making a mistake.
Why this matters beyond schools
The cohorts arriving at college from the late 2020s will have been taught with AI tools, within a digitised curriculum, by teachers trained through a new digital skills pathway. FE providers will need to build on that foundation, not repeat it. If your provision is not ready for learners who arrive with a different relationship to technology, you will be playing catch-up from day one.
More immediately, the AI tutoring tools being designed for secondary maths and English have obvious relevance to resit provision. The white paper creates new Level 1 qualifications for 16 to 19-year-olds alongside 100 hours of English and maths for those without a grade 4. If the AI tools work, the question of whether they extend into FE is inevitable. The paper does not address this. It should, and FE leaders should be asking.
The bigger challenge: preparing people for work that is changing now
But the white paper, for all its welcome specificity on AI, is still primarily about schools. The questions that matter most for the FE audience are bigger.
Entry-level roles are being reshaped faster than our skills systems can respond. When a single AI model update can shake an entire sector’s stock price, we need to ask what that means for the apprentices, trainees and career-changers we are preparing. The skills they need are not static. The notion that we can train someone once and they are set for a career is already outdated. Continuous learning is not a slogan; it is an operational requirement.
McKinsey’s 2025 Global AI Survey found that 88 per cent of organisations now use AI, but only 6 per cent generate meaningful returns. BCG’s research shows that 70 per cent of AI implementation challenges are people and process related. The technology is not the constraint. We are. The scarce capability is not knowing how to use an AI tool. It is knowing how to help organisations transform around AI: the governance, the change management, the critical judgement about when AI adds value and when it does not.
That is a skills gap that sits squarely in FE’s territory. And it is not being filled.
The management revolution nobody is talking about
Consider what is coming. Managers will increasingly need to oversee not just human teams but hybrid teams of people and AI agents. The human-to-agent ratio in many organisations is going to shift dramatically over the next few years. That requires a fundamentally different set of management skills: understanding what AI can and cannot do, knowing when to trust its output and when to override it, designing workflows that play to the respective strengths of humans and machines, and governing data use responsibly.
Where are these management skills being taught? Not at scale. Not systematically. And not, for the most part, in FE, despite the fact that FE is closer to the world of work than any other part of the education system.
AI literacy, AI governance and entry-level digital skills are not optional additions to the curriculum. They are becoming foundational. Yet Gallup reports that more than two-thirds of US teachers received no AI training last year, and the UK picture is unlikely to be dramatically better. We cannot teach what we do not understand ourselves.
What we can learn from elsewhere
Singapore has embedded AI literacy across its education system and directly linked it to workforce development. Its National AI Strategy 2.0 explicitly connects what happens in the classroom to what employers need. Qatar is investing heavily in AI-enabled education as part of its economic diversification. Hong Kong has published a detailed AI education blueprint that goes beyond schools into lifelong learning. These are not perfect models, but they share something the UK currently lacks: coherence between education policy and skills policy on AI.
The UK now has a schools white paper that takes AI seriously and a post-16 skills white paper that largely does not. That gap is a problem. FE sits in the gap.
Three things FE leaders should do now
First, invest in your own people’s AI understanding. Not tool training. Understanding. Your staff need to know what AI actually is, how it works, why it fails, and when to trust it. This is the foundation for everything else. Without it, every other AI initiative is built on sand.
Second, audit your curriculum for the AI-shaped world of work. Are your apprenticeship programmes, your vocational qualifications, your adult learning offers preparing people for a labour market in which AI agents are colleagues, not curiosities? If you are teaching someone to be an administrator, a paralegal, a junior accountant or a marketing assistant, what does that role look like in three years? Your employer partners are asking themselves the same question. They need you to be part of the answer.
Third, make your voice heard on policy. The schools white paper has set a marker on AI in education. The post-16 equivalent has not matched it. The 12-week consultation on SEND reforms is open, but the broader questions about AI in FE need advocacy too. Where are the explicit AI literacy requirements across FE qualifications? Where is the practitioner preparation strategy? Where are the evaluation frameworks for AI tools before they enter learning pathways? These questions were not answered in the skills white paper. They need answering.
The ground is shifting
The schools white paper tells us that the government understands AI matters in education. It has committed real money, set specific timelines, and drawn a critical distinction between AI that helps learning and AI that undermines it. That is progress.
But further education is where the skills system meets the labour market. It is where people retrain, reskill and prepare for work. If AI is reshaping that work, and it is, then FE cannot afford to wait for a policy framework to arrive. The schools white paper has shown us the direction of travel. FE needs to get ahead of it.
Professor Rose Luckin is Professor of Learner Centred Design at UCL Institute of Education and founder of Educate Ventures Research. She publishes The Skinny on AI for Education and is the author of Machine Learning and Human Intelligence and AI for School Teachers.