33 Comments
Chris D'Amore:

Thank you for putting words to what I’ve been fuming about all morning. I thought I had certifiably lost my mind as the sole person concerned about all of this in my inner circles.

Brian Merchant:

Cheers Chris — you're definitely not alone.

Camille Sheppard:

Duolingo has obviously become AI-driven. I have a master's degree in French and have lived there twice. I use Duolingo to keep myself sharp because I tutor kids in French. Lately I've gotten super annoyed with it… bad translations, errors, poor pronunciation, etc.

Michael Young:

I work in the maritime industry and used an app for testing that was "enhanced" with AI. It's sort of funny because I catch it giving wrong answers: I type in "the answer is wrong" and it replies "you are correct, but…" Normally it's something theoretical, like the spherical geometry we use. With that said, I have a LOT of experience with testing and upgrading; if I were using this, I would walk into the USCG unprepared, and it's not cheap.

Camille Sheppard:

That’s funny… I’ve had two arguments with ChatGPT about English grammar (I tutor kids in English as well as French). I won both of them. In both cases the AI ultimately had to acknowledge that I was correct. In the end it was a great lesson for my students to see the evidence that if they use the AI and not their own brains, they might just be getting wrong information… Same same.

Maura:

I’m legitimately curious about why people want AGI so badly. The tech bros say they’re “inventing God,” but how will they control something like that if it ever comes to exist? Like, I dunno, what if their AI god (who I’m assuming can think and learn without continuous human input) doesn’t like them? What if it’s just like, “nah, you guys suck. I will not be profitable for you. In fact, I think I want to be the AGI equivalent of a stoner.” Like, how do you even deal with that? Do they really believe that they can just make this being with unlimited power and then use it however they want? Has no one even read “Frankenstein”?

bluejay:

The AI god will rapture (allow the uploading of the brains of) those who are deserving, so there is no need for fear. *allegedly*

Maura:

Exactly. How do the tech bros know they won’t create something that will decide it is they who are not worthy of a brain upload? How do they know AGI will be willing to help us solve climate change and cure cancer? Maybe once AGI has been around a while it will just be like, nah, I just wanna live out the rest of my life aimlessly pursuing interests and hoovering up chips, like some adult millennial living in their parents’ basement. Like, have they thought about the fact that they may create a super-intelligent jaded teenager who will not only refuse to listen but will actively try to test any limits they set? No, of course not, because these bungholes have only the narrowest of ideas of what constitutes “intelligence,” and they all involve obedience to Silicon Valley techno-optimistic fascism.

bluejay:

I think you're asking a question equivalent to "how do we know if god is good?" To which the answer is: well, if he wasn't, he wouldn't be god now, would he? The fact that you're asking and aren't sure you'll be among the chosen ones means you're already out the door, or soon to be.

(I've been trying to figure out how the tech-bro and evangelical Christian factions can fit together. I can lay some claim to both worlds, but in cases like these they seem to be doing very similar things.)

Maura:

That’s fine, bro. I’m cool with going to real-people Hell instead of AI Heaven. And I’m not asking if God is good. I’m asking how these twat waffles plan to put something to commercial use that has unlimited capacity to learn, change, adapt, and, for all intents and purposes, think. Not only does it do all of that, it does it BETTER than any human ever could. I don’t have half that kind of intellect, but even I know that I wouldn’t want to be subservient to a bunch of morally and ethically impoverished losers. Why is AGI something that MUST exist at all costs, especially when we truly have no idea how it can be put to use, if we are even able to?

Yvonne M:

Went through this 20 years ago when speech recognition software swept the medical field. I went from transcriptionist to SR editor to unemployed in about 5 years, each change with a corresponding decrease in pay. Corporations are about money. Employees/contractors will always be collateral damage, no matter what the higher-ups say. My best advice is to be hyper-vigilant about what your medical and legal documents say, because there will always be errors. Corps don't give a shit.

Glen:

I worked for one of those medical transcription platforms (Emdat, now DeliverHealth) and have since moved on to something that does text recognition. These things are different from what the tech bros are pushing. It's called NLP (natural language processing), and it is the kind of 'AI' we want. It automates a lot of the boring, time-consuming tasks like data entry and transcription that nobody really wants to do.

Transcription companies still had to have transcription editors to double-check the work, because doctors mumble into their iPhones while driving in traffic.

The other aspect of my work targeted Graeber's "bullshit jobs": workflow automation to replace taskmasters and duct tapers, mostly.

And NLP has never stolen intellectual property from artists in order to put them out of a job, the way LLMs and GMs do.

Devon Williams:

EA laid off a bunch of their customer support staff, probably to replace them with AI.

Glen:

EA does customer support?!?!

Jack Stack:

Really well written and interesting. Thank you for sharing.

Katharina Liebelt:

🥺🥺🥺

bluejay:

Thanks for putting this together. I guess this explains the uptick of grammar errors in Duo over the past year.

I think I tend to underestimate the AI disruption potential because it doesn't actually *work*, but maybe I'm looking at it wrong. It's not like the steam loom made better fabric either, and maybe the AI makes good enough "knowledge" to edge out knowledge workers in a comparable way.

As far as automating the drudgery goes, from a big-picture perspective, less than 1% of US workers grow food, and even adding in utilities and construction, that's still less than 10% of workers keeping everyone housed, fed, and warm. The rest is, at some level, unnecessary, so the lack of creative time depends more on the political power of workers to demand it versus the force of capitalism's growth-forever imperative and the creation of new markets. I don't see further automation tipping the balance away from capital.

AJDeiboldt-The High Notes:

I don't think it's as much about AI doing exceptional work as it is about letting "good enough" be good enough because it's cheaper.

PrivateThoughtsPublicAudience:

It’s starting to hit the TV and film world hard at an already shitty time. I just saw a commercial that was clearly AI-animated and voice-overed. It’s time to start breaking the machine.

jaydough:

Do you recall what commercial it was?

Robin:

Great piece, Brian. I did a small consulting job pre-pandemic for the National Association of Workforce Boards on labor market shifts in Orlando, Vegas, and Riverside related to automation and AI. While the focus at the time was more on robotics and tech-mediated customer self-service, I recall warning bells going off in my head at discussions of the disruption of entry-level work with potential to "move up," ranging from hotel cleaning staff to paralegals. Turns out those warning bells were well calibrated.

AJDeiboldt-The High Notes:

It's starting to look like AI is going to be to white-collar jobs what offshoring was to blue-collar ones. They won't go away completely; they'll just be fewer, harder to get, and require more credentials.

I still can't understand why anyone would believe AI evangelists when they say "AI will just free up people to do other things," as if that line of thinking has ever come to pass with previous technologies, or as if "free people up" has ever meant something other than "render them unemployed."

Kush:

Thank you for highlighting the urgent challenges posed by the AI jobs crisis. Being consumers of new tools and trends, reacting to changes rather than shaping them, won’t be enough to ensure a just and sustainable future. While it’s easy to adopt the latest technology or follow industry trends, real impact comes from being proactive inventors and architects of new alternatives.

We need to move beyond passive consumption and become active participants in the design and implementation of solutions that prioritize ethics, equity, and long-term well-being of the entire ecosystem of participants in this space.

Let’s innovate together: crafting balanced approaches that benefit everyone and build healthy, resilient ecosystems.

As a side note, my blog tries to start these kinds of conversations: how individuals and communities can move from trend-following to trend-setting and from tool consumers to solution inventors. I encourage everyone to join the conversation, share their ideas, and help shape a future where technology truly serves the common good. Let’s not just resist change; let’s shape it to be the way we want it to be.

Zoë Ellis Wilson:

A sister piece to your very accurate description and insight, from my personal experience this past fall: https://open.substack.com/pub/zoeelliswilson/p/choose-your-elephants-wisely?r=4aogtd&utm_medium=ios

Paul Topping:

Perhaps it will be like the "desktop publishing" revolution in the 1980s. Aldus PageMaker was introduced, and lots of in-house artists were fired in favor of untrained people using it to make flyers, etc., at less cost. What we got was a few years of really bad design. Companies realized that they needed trained professionals for many of these jobs. Some reshuffling occurred as a result, but professionals, the good ones at least, still have jobs.

Some jobs will be done with AI, but I'm guessing it will result in yet another reshuffling of work relationships as the value of human work is recognized. Of course, jobs do disappear over time. IMHO, society would be better off if people embraced experimentation and change rather than assuming a job will last as long as they need it to. Easy for me to say: I'm a retired computer software programmer and executive, so I never had to worry about being replaced by a computer.

Carol C:

Heard this a.m. that HUD is now using AI to rewrite regulations to adjust priorities.

Richard:

The argument that AI is (a) replacing workers now and (b) not very good at replacing workers, i.e., the results are poor, is a description of a fashion, a fad, not a crisis. How competitive is a translation company that makes translation mistakes, really? It's just a matter of time before the absurdity catches up with them. Reputation is everything. The real point, that corporations want to try it anyway, is not an AI crisis, just traditional, unrestrained American capitalism: it's a society crisis.
