Why might ChatGPT damage learning?

This is not another moan about the perils of ChatGPT, or teeth-gnashing about the inexorable takeover of AI. It is a quick expression of a genuine concern about the losses to learning that could attend useful tools like ChatGPT if they become a classroom mainstay.

I suspect that ChatGPT, and the AI revolution, like most ground-breaking tech innovations, will change lots of things. But we will quickly assimilate those changes, adapting AI to suit our long-standing needs. Education may well end up not as dramatically different as we hope, or fear.

Despite being confident in how we can adapt and use ChatGPT and AI for our betterment, I am worried – from an educator’s perspective – that it could be significantly damaging to learning.

How might ChatGPT hamper learning?

Learning to read and write, or do algebra problems, is cognitively demanding for pupils of all ages. Reading to learn about the great religions, or the Norman Conquest, is without a doubt a tricky, effortful act. And let’s be honest, the average teenager would be happy with a shortcut.

There is an imagined future where AI does all the hard work for our pupils and university students. The International Baccalaureate has already accepted that ChatGPT can be used by pupils if they reference that help accurately. Maybe it will become integral to schooling and even trump the Internet?

AI tools will be able to write thoughtful, creative stories, or generate balanced arguments about the impact of the Norman Conquest at the click of a button.

In short, it may do a lot of the hard thinking for us.

And therein lies the problem.

Learning happens when we think hard.

Pupils need to grapple with new words – write them down, struggle with them, look them up, use them badly, get feedback, and then use them better. They need to write flawed stories filled with errors and clumsy expression, thereby learning how to edit and adapt their writing, so that they ultimately find their own unique voice. They need to grapple with selecting evidence from a variety of sources to compose a decent essay on the impact of the Normans.

More than the Internet ever could, AI tools like ChatGPT can do all these things for pupils. But they will shortcut learning if we allow that in the classroom. The notion of ‘Just Google It!’ is a long-standing misnomer. It misunderstands the effortful act of learning, and how we gradually build a rich storehouse of cognitive connections through effort, failure, and success.

If we replace extended reading of tricky texts with short, accurate AI answers, we could hamper learning.

If we replace making notes, planning, drafting, editing, and revising a history essay, with speedily generated AI answers, we could hamper learning.

If we stop handwriting notes, and instead get ChatGPT to generate those notes for us, we could hamper learning.

Our problem may be that we confuse learning with the final end-product, rather than the effortful thinking necessary to create it – which is where learning actually happens.

I use ChatGPT – it is helpful in a variety of ways. I am no Luddite. But I worry that if we are not careful, it may get overused in education and go on to damage our ability to learn.

4 thoughts on “Why might ChatGPT damage learning?”

  1. “If we are not careful” is the problem, not ChatGPT or anything else. You could easily say the same thing about Wikipedia, or calculators, or even literature itself – if all students have to do is dump from the tool they use, then at best learning is diminished. In fact, it’s likely that what students are learning is that their ideas and thinking don’t matter; the important thing is getting the Right Answer.

    But that same problem has existed for generations. This is just a new leaf on an old tree.

  2. Maybe we need to instead worry about teaching students HOW to use the tools and evaluate what they produce, rather than worry that they won’t know some knowledge. Perhaps the skills of the future are not the same critical thinking skills we have and need to teach now.

    1. Alex Quigley

      I actually think we should teach students how to use these tools, but sometimes we should make the decision to remove the tool, especially given tasks where we want students to develop their understanding, as they will need to think hard without the AI tool to hand.

      We know enough about the brain to know that even if we teach them HOW, if we place the temptation of easy answers instead of hard thinking in front of a teenager, we shouldn’t be surprised if they misuse the AI tool regardless of our instruction.

  3. Alex, for me, your post and perspective gets to the heart of what subject learning SHOULD be all about: encouraging the process of thinking in students. We should be emphasising the process of working kids’ reasoning skills and encouraging intellectual curiosity for them to think about their thinking. Endorsed use of language models like ChatGPT takes away the opportunity for fruitful cognitive struggle and meta-learning, not to mention the worrying message kids might pick up about self-efficacy in learning.
