I am so grateful for your tireless work here. I often feel hopeless about the future, and while your reporting is often bleak, the very fact that it exists gives me hope.
Each time I’ve clicked on an article to learn about AI, it’s either excited tech lingo or a lyrical take on the decline of civilization for vaguely described reasons. It’s so informative and nuanced to read this text because it shows how AI is being used as a weapon to undercut not so much creativity in itself but its economy. People living off their talent and hybrid handmade/digital knowledge go against the dream of transhumanism, a sci-fi version of a cyborg human making stuff by shouting prompts at a screen. It also presents itself as an American reality: most of the jobs producing the content the world is consuming come from the US, and its economy is also, in many cases, US based. It’s strange to watch an economy eat its babies.
I have some follow-up questions: what kind of art is being used to train AI? As someone who works in contemporary art, I see quite a few things being completely untouched by whatever AI can do, which is, at the very best, 3D printing to copy artists. Because AI is learning from digital content: what is threatening, in that case? Using anime, gaming characters, patterns for design, etc.?
I will offer a sincere and perhaps grain-of-sand thought about this: without Adobe Express I would not have been able to release my podcast. I was told that podcast art is very important, and that I had to come up with something attractive. I did not. I did the best I could. But thanks to AI I was able to produce my podcast with a recognizable design that people can identify. I was told that it didn’t have to be pretty. People don’t look for pretty, they look for info that they click on. (That’s why YouTube is a dagger to the heart of aesthetics.) However, I daydream about hiring a graphic designer, a musician, and people who actually know how to do what I’m doing on my own. If I’m business savvy, that’s my goal. At no moment in my waking late hours of editing did I say to myself, “if only I could hire a really good AI.” But on the other side I worry: who are the people I’ve been dreaming of? I don’t see interesting content around. If anything, design has been completely uninteresting of late. Writing is horrible (is Netflix already using AI writers?). The translations of my texts by humans have been more than questionable. What is happening? As you say, the pre-AI world was already strangely samey, and threatened by many other issues, and I wonder if we haven’t been asleep at the wheel, fascinated by proto-slop produced by us, for us, as great art and great entertainment, but which is a copy of a copy of a copy: you know, the series, the sequel, the remake. I remember the talk about sampling, and that produced one of the best albums of the Beastie Boys, Paul’s Boutique. The use of obviously quoted material to gain rhythm, energy, etc., but making the same thing while pretending it’s new… that’s a whole different ballgame, and we seem to have been lost in that for a while, before AI was even out there. I hope this makes some sort of sense.
What I mean to say is that critics, philosophers, political science, epistemology in general have failed to really produce a valid thought about the reality of images, and now we’re all the more lost for it. Thanks so much for your text! And apologies for such a meandering comment.
Thank you Natalie — there's much that's bleak right now, but also much to be hopeful for. People are organizing, fighting, coming together. Better futures are possible.
Existing media companies, books and TV and movies, pay royalties or residuals to creators whose copyrights are used for money. This process is automated and works well. There's no reason why Altman should be exempt from this NORMAL and ROUTINE procedure! If Altman had to pay the same level of royalties as other publishers, his product would be considerably less attractive to the VC monsters.
Totally. But if copyright law *were* enforced, the most likely outcome would be the AI companies removing the most expensive IP from their datasets and keeping the slop engines running without it; the greater problem of mass automation confronting most working artists would persist in that case, which is why, in my opinion, more work is needed to preserve the creative economy.
The law is clear. It just needs to be enforced. Making copyright holders fully responsible for chasing thieves is the ultimate betrayal of the Constitutional guarantee of the right to exploit one's creative work.
Austin Kleon has a great quote in one of his books about the importance of citing your sources and how failing to do so robs everyone who views your work in some way. I think it's from "Steal Like an Artist." (Some AI companies steal, but not the way artists do. Artists run the work they consume through a *human operating system* before they use it in their own work, which is accountable to copyright & IP law.)
"When we make the case for crediting our sources, more of us concentrate on the plight of the original creator of the work. But that's only half of the story—if you fail to properly attribute work that you share, you not only rob the person who made it, you rob all the people you've shared it with. Without attribution, they have no way to dig deeper into the work or find more of it."
"So, what makes for great attribution? Attribution is all about providing context for what you're sharing: what the work is, who made it, how they made it, when and where it was made & why you're sharing it. Why people should care about it, and where people can see some more work like it. Attribution is about putting little museum labels next to the stuff you share.
"All of this raises a question: what if you want to share something and you don't know where it came from or who made it? The answer: don't share things you can't properly credit. Find the right credit, or don't share."
- From the book "Show Your Work" by Austin Kleon
What's the response to Cory Doctorow's (which I presume is close to the EFF's) argument against using copyright law to regulate AI training?
https://pluralistic.net/2025/12/05/pop-that-bubble/#u-washington
That's a pretty compelling and short argument, which essentially boils down to: if copyright law is used to stop AI training, there's going to be an enormous amount of friendly fire toward more legitimate uses of internet scraping (like the Internet Archive), so it's not worth it. Additionally, enforcing copyright law won't stop artists from being exploited, because most copyright is held by corporations that already ruthlessly exploit artists (essentially, things were already very bad). I'm not 100% convinced, though, because a lot of people I like seem to have a serious problem with the EFF's position on this.
I just haven't seen a direct response to the EFF's arguments yet.
I should write a longer post about this, and I do think Cory/the EFF make some good points. To me, it comes down to this: Copyright is maybe not the *ideal* law to protect creative workers from AI automation, but it's what we have right now, and creative workers need protections stat.
Further, there's no reason a law can't specifically target AI companies (as AB 412 does), or even certain kinds of AI companies (ones making a certain amount of revenue, etc., as AI safety bills have done), and exempt operations like the Internet Archive or nonprofits that aren't producing competing works. I think it's entirely possible to write a law, like AB 412, that wouldn't impact the Internet Archive much.
And no, copyright law certainly won't stop creative labor from being exploited, and Cory and the EFF are absolutely right that media conglomerates hoard IP and weaponize copyright law. But let's not forget copyright law is also meant to serve independent artists and working creatives, too, and in the face of the AI companies' mass automation efforts, it's one of the few protections they have.
As long as we're dealing with a capitalist system in which artists et al. hope to get paid for their creative labor — and now one in which AI companies are seeking to devalue that work wholesale with mass automation — copyright, i.e., a claim that stakes out creative/intellectual ownership of a work that costs real human labor to create, seems like a necessary bulwark.
Cory has a legit point about how corporations hoover up copyrights and IP rights, but there are still a lot of independent artists, musicians, and writers who deserve protections for their work. I’ve yet to hear from Cory how that can happen sans copyright laws.
Okay, I guess I'm an extremist. And a Luddite. I do use a computer, but use a cellphone only when the landline is down, or when I'm on the road (which is rarely). But I'm an abolitionist when it comes to AI--what I wanna know is why we don't just kill the frackin' thing? COULD we do that by a unanimous refusal to use AI? I have no intention of using it ever, if I can help it. I'm disappointed to read here that EFF opposed that little bill, that would somewhat address one of the harms. Aside from Trump's Royal Proclamation, though, Newsom vetoing a bill that would have prohibited companies from selling AI products THAT WOULD HARM KIDS (!) tells us that there is likely no chance of getting any sensible regulation of this monster. But I'm afraid that even if a consumer boycott COULD kill it, few agree with me here. Many say they use AI and it helps them. Against that important benefit, all we have to argue in favor of banning AI is:
*it requires enormous data centers which require huge amounts of electricity, causing coal plants to stay open and enough new gas plants and perhaps nukes to make catastrophic climate change a likelihood--worst case scenario, human extinction along with most other "advanced" life forms;
*they also require huge amounts of water, thus threatening the water supply for nearby residents, who are likely to pay higher water and power bills to accommodate data centers, and if there are shortages, likely the data centers will get priority;
*and the data centers also create air pollution and noise and light pollution which are terrible for nearby communities, which often have no rights;
*and ignoring climate change goes hand-in-hand with ignoring the equal threat of extinction due to accelerating biodiversity loss--this is not a matter of esthetics or morality alone--when insect populations are going down by 1 to 2% a year, in a couple of decades we are likely to reach a point where ecosystems collapse, and we depend on healthy ecosystems, so again, human extinction;
*data centers require PFAS, which along with plastic is likely the reason sperm counts are falling by about 1% a year, which is another way extinction threatens--our own and other species that use sperm to reproduce;
* then there's the likelihood that a key reason for all this is to build a One Ring To Rule Them All, a giant data center where all the info collected on all of us, from everything the Muskrats scraped from the IRS, Social Security and other government agencies, to everything we've ever posted or liked on social media to our location at any time via our cellphones, along with who we talked to...what AI can do with that enormous amount of data is comb through it to identify the potential leaders when we finally rise up;
*then there is this whole "race with China" bullshit, much of which is about the use of AI for space-based weapons, needed to create the dystopian world we've seen in too many movies; AI can also provide the guidance for drones to take out domestic enemies of the powerful;
*and then there are the things AI is good for, like creating deepfake porn videos starring classmates or ugly ones featuring exes and other enemies; creating artificial friends and lovers for mentally ill people, exacerbating their problems; creating beautiful illustrations (ripped off from human artists' work).
So, all that is the downside. Tell me again, what positive benefits does it bring that outweigh all that?
Oh yeah, the fallback--"it's inevitable, it can't be stopped, best we can do is mitigate the harm a bit." Which makes me think of two things--Maggie Thatcher's TINA and the advice once given rape victims to "lay back and try to enjoy it."
I suppose I should apologize for the length of this post, but dammit, this all needs to be said, and if there is some enormous positive benefit of AI that justifies its continued development, someone tell me what it is!
There has to be a way, and abolition is one of them.
You're correct that there's no reason for this AI "technology" to exist at all. Even the race with China is pretty spurious. At least the space race of the last Cold War had clear military implications for the winner and pretty obvious advantages, i.e., satellite surveillance. I can't see any military advantage for AI that's not already possible with existing technology. The ring, as you say, will be built anyway regardless of AI.
unless collapse comes before they can finish it--that's actually my hope now.
Great work as usual 👏🏻👏🏻👏🏻
I went through a similar situation when speech recognition (SR) software became a thing. I was a medical transcriptionist for 34 years, was drafted into training the SR engine, and then SR eliminated my job. Yeah, it was seen as a beneficial tool for physicians, etc., but it also allowed hospitals to cut costs and increase profits. It also introduced an incredible amount of slop into medical records. Fatal slop, in some instances. But - what's a small percentage of lives when you're raking in sky high profits? Cost of doing business.
They want us to see AI as a tool. BUT - they're having to do a lot of explaining about how great a tool this is for the rank and file, and to me that's a giant tell it's absolutely about them making money and nothing else.
I sincerely hope there IS a way, but I'm not holding my breath. After 67 years on this planet I've seen that money is God in this world and those who have it usually get their way. I wish all the best for those who see what's happening and are fighting for their rights/livelihoods. I hope your efforts are more successful than ours were.
Thanks for this Yvonne — and I'd add that if you're up for it and so inclined, I'd love to hear your story in greater detail if you'd like to share: AIKilledMyJob@pm.me. Sounds like you have some important thoughts and experiences to share there.
I looked up the "No Robo Bosses" act and was surprised to find both my state reps had voted for it. They're pretty conservative, tech-loyal reps by California standards. It passed with a veto-proof margin https://calmatters.org/politics/2024/10/californa-veto-overrides/
Another powerful article. I'm very grateful to have found your Substack. Your book is on my essential-reads list for next year, and I fully intend to become a paid subscriber. You deserve it.
Echoing what others said, other than for the greed of the wealthy, AI is totally unnecessary. We don't NEED to create our own Buzz Lightyear content for our kids, we don't NEED to create our own AI versions of popular songs, we don't NEED to create our own books from existing ones. There are billions of movies, books, songs, etc. that have already been created and will continue to be. We need to continue to advocate for humans. I know there are arguments outside of the arts but I don't think they're all that different.
As I say anytime this comes up, "AI is a disease." Blessings to people like you shining a light and showing us a possible future!
Cheers Brian, much appreciated.
"Bob Iger says he hopes to take user-created AI content featuring Mickey Mouse, Marvel and Star Wars characters, and feature it on the streaming platform Disney+. Sam Altman said parents could make digital content for their kids with Buzz Lightyear."
https://kotaku.com/on-youtube-millions-watch-shows-for-children-made-enti-1788282618
ha!
The Nephew is obsessed!