The great and justified rage over using AI to automate the arts
Bosses are using AI to cut corners and kill jobs in gaming, and gamers, workers, and fans are outraged. The reason why is simple.
Hello hello and welcome to another edition of BLOOD IN THE MACHINE, a newsletter about big tech, AI, labor, and power. This thing is free to read, so please feel free to sign up below if you have not already (thank you!). It’s made possible by those of you who pay to subscribe, an act that honestly means a great deal—and makes the continuation of this work possible. Onwards, and keep those hammers at the ready.
So this week I had two long stories in WIRED, both of which had been in the works for a minute—one is about an ex-Google privacy engineer, Tim Libert, who built a search engine for uncovering privacy violations on the web after frustrations with his former employer boiled over. The other is about the impact AI is having on the video games industry. Namely, certain studios like Activision, which makes the Call of Duty games, are already embracing it, and replacing human work with the AI-generated variety.
It feels like both stories have been well received, and I hope you’ll check them each out. But the piece on AI in gaming has blown up in a way that I think speaks to where our anxieties with tech truly lie at the moment. There was a time—say, 10 years ago, in the Snowden era—when a former engineer at Google essentially blowing the whistle on the company behind the biggest search engine in the world and its apparent lack of interest in crucial privacy issues might have begotten a whole news cycle. And while there are lots of people who still care deeply about privacy, it’s clear that this moment is engulfed in fretting over AI, and especially its threat to jobs, identity, and security.
In the wake of the AI in video games piece, I’ve had so many people from so many walks of life and backgrounds—translators, artists, voice actors, animators, illustrators, etc.—reaching out, emailing, texting, DMing, tweeting. Their bosses want to use AI, too. This is happening to them, too. Or it already did, or it’s about to, or they’re worried it will. It has struck a nerve, and the outpouring of anger, opposition to generative AI, and solidarity for fellow workers has been pretty overwhelming.
The difference is probably that privacy—while an urgent concern, and one people still find deeply creepy once they learn the extent to which tech companies gather and sell data about them—feels like a less existential threat right now. (It’s also been more widely litigated in the court of public opinion, perhaps.) It’s less pressing than the prospect that you might lose your job if your manager decides to buy into enterprise generative AI software. Which is also why this is just such a tricky issue, top to bottom, and why I’ve written thousands of words trying to suss out just how worried we all should be about management using AI to eliminate our jobs.
As the AI in gaming piece revealed, studios and bosses *very much are* using AI to justify cutting corners, shrinking departments, and even, essentially, replacing workers with an AI service, by hiring workers familiar with the AI programs to churn out more stuff, faster. And yet! It’s still very much an open question whether these changes will be permanent, or permanently used as leverage against workers, or will pass once the AI systems fail to improve to a point where studios find their output satisfactory and turn out to be more trouble and expense than they’re worth.
In other words: Even now, a year and a half into the AI boom, it’s *still* unclear whether a) this will be a truly transformative and disruptive force, at least in the eyes of employers and executives, b) it will stick around for years, propped up by reams of venture capital, making a middling kind of impact in different sectors, a destabilizing but not totally destructive force for workers, or c) it will collapse in a spectacular bursting bubble, going the way of the metaverse or web3.
But underlying all this, and the root cause for much of the most vociferous despair and outrage, I think, is the uniquely depressing fact that we are having some of these conversations in the first place, at all. The pitch for automation has historically been that it will do the dangerous, dirty, and dull jobs so that we humans can focus on the stuff that allows for human flourishing. So why are we here, watching concept artists lose their jobs to software trained on previous artists’ work, auto-generating output at the press of a button? Why are we automating the job of translators, who have a unique knowledge of their local language and culture, and can artfully carry works into a new language with their meaning intact? Why do we want to live in a world where instead of an actual voice actor we have AI voice actors synthesized into a mush from all those who came before them?
Why do we want machines to do this stuff, the good stuff, the stuff that gives human life human value? The answer is: We don’t. Almost no one does, except for dead-eyed corporate executives, opportunistic founders and tech trend boosters, and those who have an antipathy for the creative arts for whatever reason. That accounts for about 1% of the population, by my impeccable calculations. The rest of us see this impulse for what it is: A naked drive to profit at the expense of people doing artistic work. The stuff approximately 85% of all dystopian futures have been imagined in order to critique. Now, I don’t think *anyone’s job* should be automated without their input, cooperation, or consent, for the obvious reasons, but it perhaps cuts even deeper when we’re automating the arts. (Or education).
It’s like the author Joanna Maciejewska put it in a viral tweet a while back: “You know what the biggest problem with pushing all-things-AI is? Wrong direction. I want AI to do my laundry and dishes so that I can do art and writing, not for AI to do my art and writing so that I can do my laundry and dishes.” But as long as we have expansive corporate control over creative enterprises, you’re going to see studios and executives inking deals with automation software companies.
In an earlier draft of the WIRED piece (it was at one point 2,000 or so words longer lol) I considered how a number of *engineers* in the gaming industry were happy to embrace AI; it made writing certain kinds of code faster and easier, and helped them bring to life the concepts drawn up by the artists. The problem is that, because bosses ultimately dictate how AI is and is not used, it’s extremely hard to land on satisfactory best practices for AI in a workplace, so guidelines end up too broad or too general, which sows discontent among the artists and designers who tend to oppose it more uniformly.
There is no way, currently, for most workers to register consent for the use of a given form of AI in the workplace. The best vehicle at our disposal is a union—the WGA writers won what is so far one of the only truly potent contracts governing the use of AI in workers’ interests—but even that’s tricky. The major entertainment union IATSE, which has been involved in organizing the gaming industry, has weathered a storm of criticism for accepting language in a recent contract that, in many workers’ views, didn’t do nearly enough to protect creative laborers from bosses using AI to eliminate or degrade their jobs.
But games workers—and illustrators, designers, translators, writers, voice actors, coders, whomever—shouldn’t stop trying. Tough as it is out there, anyone who cares about living in a world where humans can make a living doing creative work, where human work is valued, should keep having these conversations, growing solidarity, pushing back when necessary. But it is tough: One worker, who goes by Rick the Luddite on X, was fired last week after protesting his company’s use of generative AI, in another story that went viral. Rick, I dare say, is a hero. So, I think, are all of the many video game workers who came forward, anonymously or publicly, at great personal risk, to talk about what bosses are using AI to do to the field that they love. And there *is* hope: World of Warcraft workers just unionized, and so did the Bethesda games studio. They can pick up the thread that IATSE let dangle.
We have to speak up, now’s the time—OpenAI may be losing $5 billion a year, but we can’t bet on its bubble bursting. It could be propped up, like Uber, for a decade or more. We have to decide for ourselves how we want our technological destiny to be shaped—unless we want the art-automating dystopians to do it for us.