
A Moment of Opportunity for FE and AI 

Danny Brett and David Jennings Exclusive

Artificial intelligence is now shaping almost every conversation about the future of education and work. Yet, despite the sector’s agility and deep connections with employers and communities, Further Education (FE) is often left on the margins of national AI policy and investment. 

The Gatsby-funded research Generative AI in Further Education set out to change that narrative. Through a series of workshops with principals, teachers, professional services teams, and policy and sector bodies such as the Department for Education, Skills England, Innovate UK and Jisc, the project explored how colleges are already experimenting with generative AI (GenAI), and what support they need to scale its use responsibly. 

What emerged is a picture of quiet innovation and growing curiosity. FE professionals are not waiting for permission to engage with AI; they are finding ways to apply it to enrolment systems, teaching and learning, and strategic planning. But they are doing so without a shared framework, without consistent training, and without the national recognition that FE’s role in digital transformation deserves. 

This article reflects on two connected dimensions of that work. Danny Brett explores the policy and system conditions that could enable FE to lead the responsible adoption of AI across skills and workforce development. David Jennings focuses on how practical tools such as the Vocational Education Scenario Builder can empower staff to explore, test and shape AI futures from the ground up. Together, these perspectives show how FE can move from isolated pilots to a confident, coordinated movement. 

Why FE matters in the national AI conversation 

FE colleges sit at the intersection of learning, work and innovation. They train the technicians, engineers, digital specialists and care professionals who will both use and be shaped by AI. That gives FE a unique vantage point, and a unique responsibility, to lead the national conversation on responsible adoption. 

Yet current policy frameworks do not fully reflect that potential. Government strategies on AI in education have largely focused on schools and higher education.  

The Generative AI in FE research shows why this needs to change. FE has three distinctive advantages that make it central to the UK’s AI future: 

  1. Agility and applied practice. 
    Colleges have repeatedly shown their ability to respond quickly to technological and economic change. Many are already using GenAI to improve enrolment analytics, streamline administrative tasks and generate learning resources. What they lack is structured time, funding and coordination to take these early pilots further. 
  2. Direct links to employers and SMEs. 
    FE’s relationship with local businesses, particularly small and medium-sized enterprises, means it is uniquely positioned to act as a bridge between AI research, workforce need and regional productivity. Through programmes such as the Innovate UK Further Education Innovation Fund, colleges are already driving AI-enabled projects that connect classroom learning with workplace transformation. 
  3. A workforce ready to experiment. 
    The research found strong enthusiasm among leaders and staff to test AI in realistic, locally grounded ways. Teachers are exploring GenAI for lesson planning, feedback and assessment. Professional services teams see opportunities for data-driven student support and employer engagement. What is missing is sector-wide support to share learning, build confidence and address ethical or regulatory uncertainty. 

To unlock this potential, we need to shift from isolated experimentation to system-level enablement. That means: 

  • Investment in capability, not just technology. Colleges need time and resources for professional development that helps staff use AI critically and creatively, not just efficiently. 
  • Clear policy signals. FE leaders repeatedly told us they want “permission to experiment”: assurance that thoughtful pilots are encouraged, even when outcomes are uncertain. 
  • Joined-up national support. There is an opportunity for sector-wide organisations and bodies with deep sector relationships to help convene or amplify a community of practice. Their existing reach, infrastructure and experience mean they are well positioned to support shared learning, reduce duplication and connect emerging practice across the system. 

FE does not need to import AI solutions from other sectors; it can design its own. With the right conditions in place, including coherent policy, targeted funding and collaborative learning, the sector can become the testing ground for responsible, inclusive AI adoption that benefits students, employers and communities alike. 

The need for collaboration is echoed by Danny Metters, Chief Executive and Principal, Bishop Burton College: 

“Further education has the agility and employer links to lead the way on AI adoption – but it cannot do this in isolation to have real impact and meaning. If FE is to realise its potential, it needs coordinated support, shared learning and investment to turn promising pilots into embedded practice.”  

Andrew Kaye, Principal and Chief Executive of South Hampshire College Group, emphasised the importance of shared knowledge: 

“It was clear through the project research, that there is both need and demand for training and support in the use of AI, in both teaching and learning and in corporate functions. As the use of this technology develops, it will be good to see common best practice adopted across the sector.” 

The Vocational Education Scenario Builder – a practical way for FE to shape its own AI future 

One thing our work made clear is that FE staff do not need more generic talk about AI. They need safe, concrete ways to explore what AI might mean for their courses, their learners and their institutions – and to do that in a way that fits the realities of FE, not a hypothetical school or university.  

The Vocational Education Scenario Builder was created for exactly this purpose. It is a simple custom ‘GPT’ in ChatGPT – free to use for anyone with an account – built specifically for vocational and further education. Staff begin by telling it who they are (for example, curriculum lead, tutor, MIS manager, student services officer), what part of FE they work in, and a live challenge they want to think about. They then choose a time horizon (short, medium or longer term) and how radical they want the scenarios to be. 

From there, the Scenario Builder generates short, tailored “postcards from the future”: sketches of how GenAI could change a particular piece of work in the next few years – and what that might mean for students, staff, systems and governance. These scenarios are not prescriptions. The instructions are explicit that they are “pictures to help you think about the future” and may not be achievable or even desirable. Users are encouraged to react, critique and ask for alternative perspectives (for example, “what would this look like from a student’s point of view?” or “what are the ethical risks here?”). 
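For college digital teams curious about the mechanics, the sketch below shows one way the same prompt pattern (role, setting, challenge, time horizon, degree of radicalism) could be reproduced outside ChatGPT. It is purely illustrative: the published Scenario Builder is a custom GPT that needs no code, and the model choice, prompt wording and example inputs here are our own assumptions rather than the tool’s actual instructions.

```python
# Illustrative only: a minimal sketch of the Scenario Builder's prompt pattern,
# using the OpenAI Python SDK (openai >= 1.0). The real tool is a custom GPT
# inside ChatGPT; the model name, wording and example inputs below are
# assumptions for demonstration, not the published Scenario Builder itself.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# The inputs the article describes: who you are, where you work in FE,
# a live challenge, a time horizon and how radical the scenario should be.
role = "Curriculum lead for construction"
setting = "a medium-sized general FE college"
challenge = "a Level 3 course that is under recruitment pressure"
horizon = "three years"
radicalism = "moderately radical"

system_prompt = (
    "You write short 'postcards from the future' for vocational and further "
    "education. Scenarios are pictures to help people think, not predictions "
    "or recommendations; note risks and open questions for students and staff."
)

user_prompt = (
    f"I am a {role} in {setting}. My challenge: {challenge}. "
    f"Time horizon: {horizon}. Make the scenario {radicalism}. "
    "Describe how generative AI might change this work, and what it could "
    "mean for students, staff, systems and governance."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model choice
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ],
)

print(response.choices[0].message.content)
```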

In the Gatsby workshops, principals, professional services teams and teaching staff all used the Scenario Builder with challenges rooted in their day-to-day work. Leaders explored college-wide AI strategies and governance. Professional services staff worked on scenarios for enrolment, CRM and student support. Teachers used it to probe the impact of GenAI on lesson planning, feedback, assessment and curriculum viability. The report’s examples show that when staff bring detailed local context into the conversation – staffing pressures, funding constraints, regional labour market trends, course mix – the resulting scenarios are more grounded, specific and usable. 

Crucially, the tool helps staff move from using GenAI in a narrow, “summative” way – quick answers, generic best practice – towards a more exploratory “generative” use, where AI is a thinking partner rather than an oracle. The dialogues in Appendix 4 of the report show staff iterating with the Scenario Builder: challenging its first attempts, adding more context, and pushing it for clearer examples, ethical analysis and different stakeholder lenses. This is exactly the kind of capability FE will need if AI is to support staff-led, locally anchored change. 

We believe the Scenario Builder will scale well beyond individual pilots. The report argues that what FE now needs is time and community: space for staff to share the scenarios they generate, the prompts that worked, the pitfalls they hit and the organisational questions that follow. A scenario-focused Community of Practice – supported by bodies such as Jisc, the Association of Colleges and Innovate UK – could use the tool as shared infrastructure: swapping prompt templates, adapting the Scenario Builder tool for different regions or subject areas, and building up a live library of FE-specific AI futures rather than starting from scratch in each college. 

Commenting on the potential of the Scenario Builder, Danny Metters said: 

“The Vocational Education Scenario Builder empowers educators and leaders to imagine practical futures for teaching and learning. By combining clarity of thought with the creativity of generative AI, it turns challenges into opportunities—helping vocational education evolve with confidence and purpose.”  

Andrew Kaye added: 

“This research has come at a really important time; AI is such a hot topic, and the opportunities appear significant, but its use isn’t yet fully embedded in colleges. The scenario builder approach was a really effective way to test the use of AI in supporting strategic decision-making in our complex sector, particularly at a time of such significant policy change and socio-economic challenge.” 

A call to action 

If you work in an FE college, the invitation is simple: pick one real issue and try the Scenario Builder with it.

Your issue could be: 

  • a course that needs redesigning or is under recruitment pressure 
  • an admin or MIS process that is creaking under workload 
  • a question about how to guide learners’ own use of GenAI 
  • an early AI pilot you are unsure how to scale or govern. 

Use the tool to generate one or two scenarios and then discuss them with colleagues. What feels plausible? What feels risky? What would you want to preserve about current practice, and what might you be ready to change? Treat the scenarios as conversation-starters, not instructions. 

We also invite FE sector bodies, funders and agencies to back and convene this kind of work: support colleges to use and adapt tools like the Scenario Builder, create shared spaces where staff can exchange scenarios and learning, and align AI guidance and CPD offers with the generative, staff-led practice the report describes. 

The technology for FE to imagine its own AI futures already exists and is cheap to use. The next step is collective: using it to build confidence, shared judgement and a stronger voice for FE in the national AI conversation. 

Conclusion  

The FE sector is at an important crossroads in its relationship with AI. What this research shows is not a sector waiting to be told what to do, but one already taking thoughtful first steps, testing ideas, exploring possibilities and considering how AI can genuinely strengthen teaching, learning and organisational practice. 

To build on this momentum, we need space for safe experimentation, clearer guidance, and opportunities to learn from one another. Tools like the Scenario Builder offer a practical way for staff to explore AI in context, while wider policy and system conversations can help ensure that colleges have the support, confidence and infrastructure they need. 

By Danny Brett, Managing Director at ThinkMove, and David Jennings, Managing Director at DJ Alchemi

