From education to employment

AI Isn’t the Enemy of Critical Thinking: It Might Be the Best Way to Teach It

Sam Holland Exclusive

As generative AI tools become embedded in education and training, we must stop asking whether they replace thinking and start asking how they can sharpen it.

Is AI Really Killing Critical Thinking?

One of the common fears surrounding AI, particularly in education, is that it is eroding our critical thinking. If we can simply pose a question to ChatGPT and it can instantly reply with an essay that is well-researched, sourced and reasoned, then what is the point of thinking for ourselves?

This might be a hot take, but I believe the opposite is true.

I also find it a little unbelievable that people are out here genuinely asking the question “What is the point of thinking?” But here we are.

I believe that as generative AI tools become more and more ubiquitous, we as a culture will wake up to the fact that critical thinking is an essential skill for humans in the world that we are walking into.

The problem here is that we’re still considering this question in an old-world context. We still treat thought and ideas as proprietary, and as a distinctly human source of value. But when ChatGPT can spit out infinite ideas based on a single prompt, that assumption gets harder to defend. So, where now does human value come from?

It’s uncomfortable to consider, but the AI genie is out of the bottle at this point, and in order to truly understand the effects of this new technology, we need to consider it in a post-genie world.

Living With the AI Genies

Am I talking about the death of the essay? Partly, I suppose. But what I mean by a post-genie world goes way beyond that. AI will continue to permeate every element of our society and our day-to-day lives. Consulting with AI on tasks is becoming the norm in workplaces, and for better or worse, education is not immune to that.

As educators, it is our job to make sure that it is for the better.

Now, I like an essay. I like writing them, I like reading them, I like planning, editing and polishing them. I love when I can feel my arguments coming together to tell a satisfying story, to make a compelling argument, or to articulate a complex concept. I even love arranging my references and compiling the sources. Though I’ll admit I might be an island in that respect!

For me at least, none of this is negated by the use of generative AI.

Consider how products are created in the real world. Writers have editors, writing circles, researchers, publishers. All of these people work with the writer to hone and polish their work, and to ensure that it is the best product it can be before it reaches an audience.

Does the influence of these external contributors make the end product any less the writer’s work?

I think both sides can (and will) be argued. But for me, that’s the wrong question.

The right question is “Does that make the end product better or worse?”

For learners, that end product should be a demonstration of learning, skills development, comprehension and engagement. Learners don’t have editors or researchers. But they do have AI to help them in these areas now. So, the question we should be asking, and the real challenge for educators in this world of AI genies, is ultimately the same as it’s always been:

“How can we ensure that the use of technology improves rather than hinders the learner experience in these core outcomes?”

Why Critical Thinking Matters Now More Than Ever!

When I am working alongside AI in my work, I use it to brainstorm, to bounce ideas, to refine concepts, to research, to make mistakes, to explore different viewpoints, to help me articulate something more effectively.

The core ideas are still my own, even if they are articulated or even transformed by working with generative AI. When AI helps me get my point across more effectively, that in turn helps me understand how to communicate better. It helps me to see gaps in my understanding, or flaws in my logic. It ensures that my end product is well-considered from multiple angles, and honed to be the best it can be.

While my AI assistant may appear vastly more knowledgeable than I am, it still lacks the real-world context that I bring. It lacks the understanding of where I am coming from and what I am trying to do. It lacks my lived experience, which cannot be simulated. We both have gaps in our knowledge and understanding, and our collaboration makes for a richer experience, and a vastly improved outcome.

So, working collaboratively with AI helps me to improve my end product. But only because I am thinking critically when I use it.

I’ve often described working with AI as like working with the best apprentice you’ve ever had. They have an incredible wealth of knowledge and charisma. They are super eager to please (often to a fault). But they do not always have a good grasp of how to apply their skills, knowledge and charm in useful ways, in a real-world context. And as a result, quite often, they will boldly and confidently talk themselves into the wrong solution.

This is what we are for, as humans. And this is why critical thinking is vital when working with AI.

We have to assess the outputs of the AI. We have to coach it into working with us effectively. We have to provide it with useful, actionable feedback and constructive criticism. Which, by the way, are some of the core skills of our tutors and assessors already. So really, who is better placed than an educator to truly get the most out of working with these tools?

And I strongly suspect that anyone who is truly working effectively with generative AI tools regularly will agree that their critical thinking skills are actually enhanced by interacting with them.

To my mind, this is the most important thing that we can teach our learners: Working with AI requires high levels of critical thinking. But also, working with AI, appropriately and mindfully, is a fantastic way of building those critical thinking skills.

This emerging world of rampant AI genies doesn’t need less thinking. It needs better questions.

So what questions are you asking?

By Sam Holland, a digital learning specialist working in work-based learning (WBL) in South Wales, with a focus on embedding emerging technologies in vocational training environments.
