“If you want to keep your teeth, make your own sandwiches.”
– Jeannette Winterson, Oranges Are Not the Only Fruit
I think a lot about how average folks (of which I count myself one) use and will use AI. When it comes to actual AI use – as opposed to just playing with AI – I use it to do things AI can do more quickly than I can or to do things I can’t do and either a) don’t want to invest the time to learn how to do or b) have tried to learn how to do and quite honestly am bad at. (To wit – artistic stuff.)
And again – to be clear – I use AI to make my life easier.
And whether we like it or not, in school, students and teachers want their lives to be easier too – but are we thinking enough about what “easier” gets us?
Again, I think, for the vast majority of folks, we’re looking for AI to do two things:
- Do it faster than I can.
- Do it better than I can.
These are product-oriented use-cases, not process-oriented use-cases. And AI is much better at product than process. There’s a reason we call it “Generative AI” — it generates product. And as such, it’s too easy to let AI do the bulk of the work, and then edit on the margins. It’s why college kids are asking why they should write the essay when AI can do it for them. It’s why computer science engineers are seeing jobs dry up because AI can write the code so much faster than they can. And it’s why there are a dozen new products being sold to schools that offer to use AI to give kids more feedback on their work than a high school teacher with 120 kids on their roster could ever hope to do.
But those use-cases have nothing to do with actual learning. Student learning happens when we struggle, when we grapple, and when the process of creating that artifact of our learning leads us to learn new skills, gain new insights, and synthesize new ideas. And teachers learn about their students when they take the time to thoughtfully read and critique their students’ work.
And AI should not do that for us.
And if we’re not careful, we’ll end up with student work co-created by generative AI and teacher feedback co-assessed by generative AI. And in the rush to create and assess the product, we’ll miss the learning. I don’t ever want to be in a classroom where a student asks a teacher, “What did you think of my work?” and the teacher has to check to see what AI told the student before answering.
Does that mean there’s no use for AI for most of us? Probably not. If you ask Claude (my preferred AI) to give you feedback “like a writing teacher would,” it will critique your writing, rather than write for you. (Turns out, it doesn’t like parenthetical asides… too bad, Claude.) And maybe it’s ok if students take their first draft and throw it into AI and ask it to proofread it, and edit accordingly. I really don’t know — it’s the Grammarly question — when does proofreading become re-writing and when does re-writing change meaning?
But what I know is this: Deeper learning happens through the process of creating the product, not simply from the product itself.
And more and more, we’re going to have to ask ourselves – what is the thing we want kids to learn? When I ask students to do a piece of writing, I want to see their thinking. I don’t want to see what AI thinks of their thinking, and I don’t want AI to think for them. And the act of writing makes us better thinkers. The act of writing forces us to grapple with our thoughts in ways that clarify our thinking. When I blog, I’m thinking out loud. That matters — definitely to me, and hopefully to the folks who read what I write too.
And I want it to matter to students too. How they process and synthesize ideas matters. It’s how they are going to grow.
So yes, can I imagine a process by which students hand in a first draft that is completely non-AI assisted, and then hand in an AI-proofread or AI-assisted piece of writing and then write a reflection that talks about what that process meant for their writing? Sure… And will we probably have to figure out ways to help kids use AI responsibly such that they can see how an adaptive process can help them learn while improving the product? We will.
But I can also imagine plenty of pieces of work that are never touched by AI, because we want to see that learning process when it’s just the student and their ideas and skills and their ability to demonstrate them, write them, communicate them all on their own.
Because we’re going to have to remember that – at the heart of everything we ask kids to do – we’re asking them to learn. We’re asking them to figure it out. We’re asking them to grapple with ideas and concepts, even when — especially when — it gets hard. And while Generative AI can do things at speed, we should never let it take the place of our thinking and our learning.
If we want our students to keep their teeth, they’re going to have to make their own sandwiches.
Postscript:
As a thought-experiment, I asked Claude.ai to do the following:
Write a 750 word blog post about the dangers of using AI in the classroom. Focus on how AI can make it too easy to bypass the critical thinking and process-orientation that happens when students have to struggle with ideas. Use the Jeannette Winterson quote “If you want to keep your teeth, make your own sandwiches” as a framing lens.
And then, when the first draft seemed too clinical compared to how I write, I said, “Write it in the style of Chris Lehmann.”
Here’s how Claude explained the changes:
I’ve rewritten the blog post in Chris Lehmann’s distinctive style. The revision captures his:
- Conversational, direct tone – “Here’s the thing…” and “Let me be clear…”
- Personal reflection – “I’ve been thinking about this quote a lot lately”
- Progressive education philosophy – References to moving beyond factory models and empowering students as creators
- Specific institutional context – Mention of Science Leadership Academy
- Balanced technology perspective – “I’m not anti-technology. I’ve never been that guy”
- Urgent, passionate advocacy – “This is pedagogical malpractice” and “what really keeps me up at night”
- Focus on student agency – Emphasis on students as creators, collaborators, and critical thinkers
- Call to action – Direct challenge to educators in the conclusion
The piece maintains the Winterson quote as a framing device while incorporating Lehmann’s characteristic blend of philosophical depth, practical classroom experience, and passionate advocacy for student-centered learning.
(Side note: It’s a little freaky that I’ve been writing long enough that Claude can try to mimic my style.)
Here’s how Claude “thinks” I’d write this piece. It’s not a terrible piece of writing. It summarizes several of the major critiques of AI in schools that are out there right now, and it does “sound” like something I might have written. (Although, it seemed to me like there are sections that read like a parody of my writing style.) But I didn’t write it, and I didn’t think it. I came up with a short prompt and a style to write the piece in.
And even if someone thinks that the Claude-created piece is a better piece of writing than my flawed, human piece, I prefer mine. And I liked struggling through writing it. It made me just a little smarter for the act of writing it.
Let’s keep making our own sandwiches.
If there’s good news here, it’s that we’ve been waiting around for something to trigger a reset to first principles in US education, which has been trapped in the local maximum where ed reform left us. We’ve collectively gotten an early look at what happens in the decades after you destroy a practical academic field (the study of education). We’re starting down that path in medicine, scientific research, etc. now as well.
At the risk of sounding like some kind of accelerationist, fundamental questions about what school is for are becoming less escapable again. One thing I get thrown by right now is the way that the writing machines are able to jump in throughout a project’s workflow. Progressive educators deeply value “the having of wonderful ideas,” but brainstorming is one of the things LLMs are best at.
I don’t see any way around a huge return to orality in schools. In my recent experience even absent AI that has become badly needed (and of course SLA always emphasized presentation).
(Some) progressive educators are going to have to get less squeamish about what knowledge people need to hold in their heads in the coming era where the first three sources at hand may all be malign, soulless propaganda machines.
To me, these are the lines to fall back to:
- Evidence: How do we know what we know, and what’s the evidence?
- Viewpoint: Could there be another point of view?
- Connections/Cause and Effect: Do you see any patterns? Has this happened before? What are the possible consequences?
- Conjecture: Could it have been otherwise? If even just one thing had happened differently, what might have changed?
- Relevance: Does it matter? Who cares?
Debbie Meier et al., of course.
https://files.ascd.org/staticfiles/ascd/pdf/journals/ed_lead/el200905_meier.pdf
Thanks for this, Chris. I largely agree, and therein lies the rub. The work of teaching and learning can’t be divorced from the conditions in which that process is happening. It ain’t just genAI, but also the demands on students, educators, and schools that ultimately don’t allow for the liberation of learning we desire. But we’ve seen the onset of the problem for a long time now.