24 Comments
Publis's avatar

I must say this was hard to read because so much of it resonated with me. What I am hearing from teachers I know (at all grade levels) is that students are no longer motivated to learn, and increasingly incapable of doing so, in large part because so many of them grow dependent on GenAI as soon as it comes along.

Meanwhile, hiring managers I speak to are telling prospective employees that they *only* need GenAI to succeed, and even tech managers are pressured to pressure their employees to accept whatever the AI gives them, no questions asked. So students don't see a need to learn deep skills, nor do they feel safe doing so, because the overwhelming message is that only the fastest prompter will survive.

It is a self-consuming degradation that is killing education, will stunt the next generation, and in time will kill the economy as a whole.

Brian Jordan's avatar

Thank you for your fine reporting and for fighting the good fight. I am an older person so I won’t be around to experience the dream-come-true of the greedy tech gods. So sad to see humans and human emotions and intelligence valued at zero. But don’t give up—as you have reported, many people see the utter insanity of AI and maybe the tech lords will be defeated.

it's an uncivil war's avatar

This is the outcome of devaluing education. When college shifted from being about educating people to being about getting a job, this was all that was necessary to undermine education. They want us to outsource our ability to think to AI. Sam Altman recently said: "We see a future where intelligence is a utility, like electricity or water, and people buy it from us on a meter."

Deplorable Commissar's avatar

"This is the outcome of devaluing education."

Maybe education was devalued because the PTB knew AI was coming?

Nic D's avatar

Thank you so much for this series. Your writing in general has helped me reconcile my own interest in and passion for technology with my Luddist beliefs and values. This series in particular has been so helpful for having more productive conversations with others about the societal harms of these technologies and how they are implemented. You’ve made me less of a Doomer.

I left the tech industry (UX Design) a few years back. AI-related problems were not the only reason, but they were the nail in the coffin.

Since then, I’ve worked in education and public libraries, and I’ve worked on-and-off in kids’ tech education my whole working life.

Man, it’s bleak. Many of these stories resonated with me, but I think what they’re missing is the overwhelming hopelessness for the future that so many students feel, and how that motivates them to cheat themselves of an education. AI isn’t directly making them uncurious or lazy, but it allows them to stay that way. So many of the students I have worked with have so little hope for the future. They have no motivation to try. It’s normal for kids to think school is pointless sometimes, but, anecdotally, the volume and intensity of those thoughts seems to have increased. It’s not helped by a decrease in digital literacy spurred on by the lingering impact of the digital native myth on education.

I’ve got mad respect for the young Luddites out there who haven’t just given up and are actively resisting. They’re a beacon of hope right now.

Kit's avatar

I also thought of the missing student voices while reading this, but specifically the students who want to have a meaningful educational experience and are being increasingly deprived of the opportunity to do so. I know younger kids for whom the utter emptiness of writing essays to a robot is deeply demoralizing. It robs them of the chance to receive meaningful feedback that takes them seriously as thinkers in formation. It undermines the trust and humanity in the student-teacher relationship. (These kids are, not coincidentally, strongly anti-AI.) And as higher ed institutions increasingly jettison any semblance of integrity and embrace the bullshit with open arms, the path becomes murkier. Where will the current and upcoming generations of college students find the people who will teach them to think, write, research if those people have all been forced to restructure their courses or just laid off entirely?

EA Mayes's avatar

These testimonies are overwhelming to read. AI, hyper-capitalism and fascism are the evil trio of mutually reinforcing monsters. But as AI takes the capitalists' 'logic' of cutting labor costs to its most extreme absurdity, humans will be excluded from this machine-perpetuated regime, leading to two opposing societies. If we're in the human society we better start planning effective subversion of the machine regime. It's possible and our only hope. We might compare it to Iran's balsa-wood drones confounding the missile-interceptor systems deployed by the parasitic military-industrial complex.

Taylor Smith's avatar

I am a full-time, tenured professor (music). I've been in my position since 2008 (a community college in San Diego, CA). In many ways, this was my “dream job.” Now, I am scanning the exits, looking for some kind of off-ramp.

The prevalence of AI is not the **only** reason for this, but it’s pretty high on the list. Unfortunately, I feel like I am in the minority on my campus in my skepticism toward AI. It seems most others have either been smitten by hype or are in the “it’s-the-way-of-the-future-so-we-should-get-on-board” camp. Of all people, why are **we** not pointing out all of the reasons to be skeptical, even antagonistic toward this “development?!” It’s quite baffling to me.

Online “teaching” was always a compromise (something I wish more of us were **also** willing to admit); now it borders on some kind of joke. My History of Jazz class last semester was mostly me calling people out, over and over, for using AI. I found myself glad to see someone write terribly or do poorly on an assignment; “at least they’re not using AI.”

My question for these students is: What’s the point? Why sign up for this class if your plan is to have a robot in the sky do everything for you? If this topic is so uninteresting or so unworthy of your time, why not take a different class? What are you in such a hurry to do rather than this? (I wish more professors and ed admins were asking these questions of themselves as well.)

CCW's avatar

What do the students say when you ask them these questions? Often I've just gotten blank stares back, which feels hopeless.

Taylor Smith's avatar

When I ask, it’s usually some combination of “I have to,” “for GE credit,” and the aforementioned blank stares. I try not to take their disinterest personally, but it’s hard; I’ve dedicated a significant part of my life to this stuff, after all. That “hopeless” feeling is a big part of why I am scanning the exits.

Disposable Poetry's avatar

Reading these stories, it seems to me that when AI use becomes universal, whoever is operating it will have absolute control over the truth. Absolute control, absolute power. No wonder so much is being invested.

AJDeiboldt-The High Notes's avatar

I wonder how many of those students would be comfortable flying on a plane piloted by someone who'd never actually practiced flying but instead used AI to "learn" how to do it. For the younger generations, it seems like mere competence is going to be currency when a lot of them will go out into the world not knowing as much as previous gens about the fields they've chosen because they just used AI to feign actual knowledge.

Keep up the great work, Brian. This is important stuff!

Bryan Steele's avatar

Thanks, Brian. In addition, there is one area of AI in education that is not getting enough discussion: what AI use does to the human brain. Over-reliance on AI causes brain atrophy, the literal shrinking of brain matter. It's the old idea: use it or lose it. Postmodernism was the beginning, and now AI is the final nail in the coffin of the American mind.

Golden Hue's avatar

I am about 18 months away from retirement in the higher education sector and I feel like it is in the nick of time. Whenever I am forced to use genAI it feels as if I’m participating in my own subjugation. I have been asked to incorporate something about AI into my teaching—among other things, I teach learners about finding and evaluating information sources. I don’t want to do this because I want to interact with AI as little as possible, myself.

AI is also affecting the interview process. I’ve heard one person describe being subjected to a job interview conducted by a chatbot. I also interview prospective students for the institution where I work. Since the pandemic, we conduct these virtually. Now students are using ChatGPT to generate answers to our interview questions during the interview itself. We cannot see what they’re doing, of course, but they’ll say, “Can I have a moment to think about that question?” Then comes a wait of about 30 seconds, their eyes on the screen, then a little jolt of the head when the answer appears and they read it off. This “having a moment to think” never happened before 2024.

Brian Roach's avatar

As someone who works at a university press, with one kid in college and another heading next year, this was a very tough read. I don't have any answers but I appreciate you and many others continuing to shine a light on these issues, and hope we can work our way out of this nightmare sooner rather than later!

Leon S's avatar

Brian, these are so incredibly depressing to read through, the whole series... but I make sure to read every single one. All these voices deserve to be heard. Thank you.

Narrative Myth's avatar

To be fair, though, teachers do repeat the same material year after year to their students. The material is based on books and, in many cases, has basically not changed in 50 years or more (take math, chemistry, or physics, for example, or literature and history). A teacher cannot individualize their teaching for each student; an AI chat can. AI is a revolution in personalized learning, in my view. No wonder it puts teachers in a difficult spot.

Emil's avatar

This was a sad and grim read, and there are many directions a discussion about these issues could go, but for me one sentence stood out the most, as it’s also something I have been sitting with while working on education projects:

“Increasingly my students genuinely do not understand why they should not use AI anyway... what is the point of ‘wasting’ days researching and writing an essay when the AI version will be as good or even better?”

So I wonder: can we, collectively as a society (or at first just as education professionals), actually agree on “what is the point” of these types of tasks, in a way compelling enough that a young person, currently experiencing this wildly exciting and overwhelming technological development, can understand it and see themselves needing it in the future?

From the young person’s perspective, almost no one in their family or immediate circle of influence writes essays for a living, engages in contemplative research, or enjoys reading complex works of writing for the sake of deep analysis.

We as adults (and I’m saying this loosely about myself, in my mid-30s with three kids), with complex lived experiences, having grown up and matured in drastically different times, can understand the value of critical thinking in a polarising world, of clearly articulating our thoughts for life and career development, and of reading, as well as being able to analyse what we read.

But most of the time, we had to learn these life lessons the hard way or because we were curious about something that required us to learn these skills in the process.

These teens and early-twentysomethings have grown up in a fully interconnected, information-saturated, technology-driven world where an answer to any question can be called up in a matter of seconds, and a tutorial for almost any task can be watched while sitting on a toilet, so envisioning a time when that’s not the case is close to impossible for them.

On top of that, they are constantly surrounded by adults and lecturers who come from different generations and, for the most part, are fairly illiterate when it comes to technology and young people's culture, while hell-bent on proving that their hard-knocks way of life is the best teacher. So it’s completely understandable that they would rebel against, cheat, and bypass a system that they do not believe in or haven’t been convincingly shown the value of.

I’m no AI evangelist; I see both the evil and the benefit that come with these emerging tools. But if we want the practice of acquiring an education in an institution to have some intrinsic meaning for young people, professors and lecturers have to find new ways to engage learners’ curiosity so that they want to do the assignments themselves.

Students need to be able to understand “what’s the point” of all this education and see its relevance for their future, without being presented with recycled versions of “you’ll not be able to think critically” and “you’ll understand when you get older and go out into the big world.”

Their future, and to be fair all of our futures, is highly uncertain. So for students, the AI tools become a coping mechanism for getting by, freeing their energy and mental capacity for the bigger and more personal questions they have to grapple with.

Michael S. S. Curl's avatar

The end of university and education; the age of ignorance begins.

Ebenezer's avatar

"And now fascism is mainstream, almost unquestioned, inevitable."

The two largest protests in US history both happened in the past year: https://en.wikipedia.org/wiki/List_of_protests_and_demonstrations_in_the_United_States_by_size

If that's "almost unquestioned" then I am a tortoise. Glad to see this "educator" isn't in the business of teaching students critical thinking skills.

Honestly that entry was a nice reminder that ultimately, every person who contributed to this post is someone who has a financial incentive to secure their job... but that doesn't mean they are necessarily any good at that job, or someone you would actually want educating the youth.