They also reported, in the very same survey, that they are using AI for homework more than ever before, with usage climbing from 48 per cent to 62 per cent in barely seven months. The students, in other words, can see the problem clearly. They simply cannot stop participating in it.
This suggests that what students using AI lack isn't critical thinking skills; if they had none at all, they wouldn't be able to see the problem in the first place. What they actually lack is self-control, the discipline to stop using AI for their homework.
This is not cognitive dissonance in any simple sense. It is something more structurally interesting: students have correctly diagnosed a systemic problem, but they exist within a system that gives them no rational incentive to behave differently.
The incentive would be gaining the skills the schoolwork is supposed to build. Schoolwork has pretty much always been designed as a framework of building blocks: skills like retaining information, recalling what you've learned and combining it with new information, and stacking knowledge on knowledge so you go from grasping the barest concepts to understanding the technical details. Those skills are what bring reading and writing, math, science, etc. together, so we can go from 1+1=2 to πR^2 and understand the real-world applications.
While I can get behind the idea that teaching critical thinking has been minimized to its most basic form in schools for decades in order to facilitate better test scores, I don't think it's fair to say that schools don't teach critical thinking skills. Perhaps it's fairer to say that schools don't prioritize teaching them, relying instead on people's ability to observe, intuit, and come to conclusions, which is exactly what critical thinking skills are.
I do agree that generative AI LLMs are exacerbating the problems that were already there in this system. I even agree that administration at pretty much all levels of teaching is failing to mitigate the problems with generative AI LLMs.
But students wouldn’t know this without critical thinking skills:
Assignments are graded. Grades determine university admissions. University admissions determine (or are perceived to determine) life outcomes. If your peers are using AI and getting better grades, opting out is not a principled stand.
You didn’t need “AI” to do this.
Well, (some) people’s willingness to believe anything an LLM tells them does speed up the process.
The Khanmigo AI mentioned in the article seems like an interesting thing to try and maybe actually recommend to students (they're gonna use AI anyway, so why not recommend something that gives hints instead of confident BS).
The No Child Left Behind Act of 2001 (NCLB) represented, in the United States at least, the triumph of measurable outcomes over meaningful learning. Under its regime, schools were judged by their students’ performance on standardised assessments. The consequences of poor scores were severe: funding cuts, staff dismissals, school closures. The entirely predictable result was what educators came to call “teaching to the test,” a practice in which classroom instruction was narrowed to the specific content and formats that would appear on state exams.
As someone who graduated before then, this was a problem long before NCLB. It’s also a fundamental misunderstanding about the purpose of American schools. Schools have two goals:
- Babysit kids, so that both Dad and Mom can go to work
- Train kids to do arbitrary, bullshit tasks that don’t have any meaning or purpose
Any actual learning when I went to school was tertiary, and critical thinking was constantly shot down. If you wanted an A, you didn't think critically; you regurgitated what you were told, even if it was wrong.