When Generative AI Backlash Dominates the Conversation! | 02/07/2025
11-bit studios is the latest developer to face a community backlash
The AI and Games Newsletter brings concise and informative discussion on artificial intelligence for video games each and every week, plus a summary of all the content we’ve released across our various channels, like our YouTube videos and in-person events such as the AI and Games Conference.
You can subscribe to and support AI and Games right here, with weekly editions appearing in your inbox. If you’d like to sponsor this newsletter, please visit our sponsorship enquiries page.
Happy Wednesday, and welcome (back) to the AI and Games newsletter. In this week’s issue - which is written 100% by humans, meaning you will find spelling and grammar mistakes throughout - we dig into public sentiment around games as their adoption of generative AI becomes apparent. From apathy to disgust, it gets all that much worse when you’re found to have used it, but did your best to keep it hidden.
Plus of course, all the regular announcements and other jazz that we do right here every week. Let’s get started!
Follow AI and Games on: BlueSky | YouTube | LinkedIn | TikTok | Twitter | Threads
Announcements
Alrighty, another week, another whole bunch of exciting stuff happening on AI and Games!
‘Alien: Isolation - The Retrospective’ is Live!
After months of working away on this project - the biggest video project we have ever worked on - it is now available on YouTube. Alien: Isolation - The Retrospective is a feature-length video clocking in just shy of two hours, in which we dig into the making of Creative Assembly’s beloved horror game.
This video has already broken records for our most watched premiere on YouTube, and the highest watch count in its first 24 hours - a record previously set by another video on Alien: Isolation. A huge thank you to everyone who has watched it already; the response has been tremendous. Plus a big thank you to the studio, who have supported us in publicising it online.
We’ll be making a written version of this project available on the website in the coming weeks. But given it is nearly two hours long, it’s taking us time to clean it up and give it structure in ways that will make sense - potentially releasing it in multiple parts. Plus we hope to make more of the interview footage available to our paying subscribers. More on this as and when it happens!
Meet Us Next Week at Develop!
Good grief, how is Develop less than a week away? Time flies etc. I’ll be on-site in Brighton next week. We have a lot going on, there’s the AI skills mixer in association with Mastered on Tuesday, and I’ve already got a batch of meetings booked in.
If you want to meet up and have a chat about how you can work with AI and Games, then click the link below!
AI and Games is Headed to Gamescom!
Gamescom has been on my to-do list as one of the big games industry events I have yet to score off, and now it’s happening. I’m pleased to say I’ll be presenting at Devcom this year. We’ll have more to say on this after Develop, but also be sure to get in touch if you’re attending Gamescom and want to meet!
Back in Dublin for NEXUS 2025
Last year I had the pleasure of presenting at NEXUS, the Irish game developers’ summit hosted in beautiful Dublin. I had a great time engaging with everyone at last year’s event, and I’ll be back once again in Dublin this October! Looking forward to it.
AI and Games Conference 2025 Submissions are Open Until August 1st
Didn't you hear? We're on the hunt for talks at the 2025 AI and Games Conference! Fellow board member and all-round lovely human Matthias Siemonsmeier gives his take on what we like to see in the submissions sent our way.
This guy loves what he does, and has such a great passion for making fun games, playing fun games, and celebrating the hard work of others. Matthias’ ‘audition’ to join the team was essentially him walking up to me outside Moscone West at GDC last year to argue his case for why we *need* an event in Europe to cater to game AI professionals.
I think that's when I learned I just can't say no to this guy. 🤣
Don’t forget to submit your talks via this link before August 1st!
AI (and Games) in the News
There’s a bunch of news stories all consolidated into our big talking point of the week, but here’s some other big headlines that are worth paying attention to:
Meta ‘Wins’ Their First Major AI Copyright Case - On a Technicality
[The Guardian]
With the legal cases mounting up against (generative) AI companies, Meta (y’know: Facebook, WhatsApp, Instagram, Quest headsets) has won a lawsuit brought by a number of high-profile authors who sued the company for using their works without permission to train AI systems. In this instance, the judge ruled that the use of their books without permission falls under ‘fair use’ of copyrighted assets - meaning there is no liability - because the plaintiffs did not give sufficient evidence of the negative impact of this action.
What they’re saying here - per my understanding - isn’t so much that using these materials for generative AI is fair game. Rather, the plaintiffs did not show sufficient evidence of how this would affect the authors. The argument had been made that it would result in dilution of the market - a flood of AI-generated content that would impact the authors’ ability to make a living as writing professionals. Ultimately Meta won the lawsuit not because they’re in the right, but, I would argue, because the authors litigated too early and failed to show the negative consequences of these actions.
The judge of this case, Vince Chhabria, said as much in his ruling, implying that this does not close the book on the issue (no pun intended) and that it remains open to further litigation over time.
“This ruling does not stand for the proposition that Meta’s use of copyrighted materials to train its language models is lawful. It stands only for the proposition that these plaintiffs made the wrong arguments and failed to develop a record in support of the right one.”
Similarly, Anthropic Wins Their Copyright Fight, But It’s Not Over [Wired]
In a similar vein, AI company Anthropic also achieved a significant win in a lawsuit raised by authors. In this instance, the judge ruled that training AI models using books by a handful of authors was fair use, given it was considered transformative in nature and not going to immediately result in the reproduction of the authors’ works.
However, it doesn’t mean Anthropic is out of hot water. Y’see, while they trained their Claude large language model on these books, they didn’t pay for them, and had relied on one of the numerous online libraries of pirated books for training purposes. The judge in this instance, William Alsup, noted that while the use of the text was transformative under copyright law, the fact that they stole the books by taking copies from the pirated library and kept them on their servers was grounds for further litigation. The company had later purchased a copy of each of these books in an effort to show that they had procured them legally. But the judge was not convinced.
As a result, the case will move to trial in December, where Anthropic will need to defend stealing 7 million books.
Denmark Amending Copyright Law for Citizens to Own Their Bodily Features [Guardian]
Perhaps the most interesting story this week - outside of our main topic: the Danish government is going forward with a change in the law that, in effect, gives citizens copyright over their face, voice, and body. This is largely to give individuals rights that allow them to defend against the use of their likeness in AI-generated content, meaning that in the long run Danish citizens will have the right to have “realistic, digitally generated imitations” of their features removed from online platforms, and potentially to sue for damages.
When Players Discover You’re Using Generative AI
For this week’s main story, I wanted to dig into a topic I’ve discussed before, but one that has most certainly reared its head in the past few weeks. The adoption of generative AI in games as part of the production - and, more critically, the end product - is a contentious one. For every potential application where it supports creatives in achieving their goals, there is equally an opportunity for a studio to use it to cut costs, but also people, in the process.
AI this past year has proven highly contentious as more and more examples of generative AI creep into society, met with responses ranging from overenthusiasm to anger, with a bout of apathy in the middle. Games studios are not doing enough to show players what value generative AI brings to their games.
Studios exploring generative AI at scale has created a scenario where players are increasingly sceptical of new projects. Evidence of generative AI artifacts can crush public sentiment, while other studios face backlash for no other reason than that their art is mistaken for generative assets.
The Success of AI Lies in Public Perception
In the past year or so, I’ve spoken about how the success of generative AI technologies lies very much in the eyes of the consumer. Regardless of the amount of money being spent and the aspirations of big AI companies, if you can’t show the value of what you’re building using this technology - and that value does not align with the perspectives of your audience - then it’s not going to work. I think we’re seeing this in various business sectors given the influx of AI chatbots powered by large language models (LLMs), but equally we’re seeing it really hit hard with game development.
This was a point that I explored in a newsletter post last year, itself a follow-up to an interview I gave for Eurogamer, in which I talked about how historically ‘AI winters’ emerge as periods where developments slow down due to a lack of funding. That funding was historically driven largely by military and government investment, whereas now it’s mostly corporate. Combined with how ubiquitous some AI has become in society and commercial products, this means it’s now less about whether the military-industrial complex is interested in investing in AI, and more about whether consumers see value and are willing to pay for it.
A part of this is very much down to the rise in public awareness of what generative AI technologies can do, and also how they are increasingly treated as a shortcut to generate assets such as text, images, audio, and video - most of which are the bread and butter of the creative industries. The rise of AI tooling that makes it all the easier to create these assets and share them online - combined with their often questionable quality and sheer ubiquity - has led to collective displeasure.
Just last week, the Cambridge Dictionary updated its definition of ‘slop’ to cover “content on the internet that is of very low quality, especially when it is created by artificial intelligence”. This is significant, given it highlights how quickly the term has become normalised in the conversation about materials crafted using generative AI.
As a result, games studios need to be aware of the implications of using generative AI, and they need a strong justification for why the technology was used in the way that it was and what value it brings. After all, the games industry already went through all of this with Web3 and NFTs - which had similar low-quality capitalist overtones - and that resulted in nothing of value. So you have to be mindful of how easily this can turn off consumers at a time when games are also becoming more expensive, again.
For me this is something I saw coming a mile away. Back at the London Developer Conference in April of 2024, I warned that this exact thing would happen. A backlash from consumers was inevitable, and it would be treated akin to the way that people buy organic groceries: a perception of increased value by avoiding elements adopted by mass-market products that erode quality. And of course it only took a few months before the likes of the ‘No Gen AI’ movement emerged, with Alex Kanaris-Sotiriou creating art assets that indies can use to reinforce this sign of quality.
The Industry is Aware of the Risks
Just last month I talked about how companies like Take-Two and Electronic Arts are aware of the “reputational risks” that adopting generative AI can present for a games company. The phrase came from a report to investors by Take-Two as they discussed the implications of adopting AI technologies into their productions. Meanwhile Electronic Arts stated that the use of generative AI "might present social and ethical issues."
This is to be expected when a company discusses the areas they’re exploring with investors and other senior business personnel. Risk is, as always, part of the job. But it’s worth highlighting how aware they are of the ethical issues and the public perception of using AI in game development. Again, to reiterate my point: unless players see the value of the technology - see it achieving something interesting to play - the public perception is largely that it is being used to replace developers and inject games with low-quality assets.
Now, in my dealings with games studios, I can tell you with certainty that almost every company I interact with is at least looking at generative AI. This isn’t to say they’re going to use it, but they’re conducting due diligence on whether it’s of value to them, and how it could be used in meaningful ways. This can be either to put it into the game itself - which is often where big issues lie - or to use it to help support game production.
At our AI and Games Conference in 2024, I particularly enjoyed the presentation by Alessandro Sestini and Luca Ballore from EA on AgentMerge: a project to use LLMs for streamlining issue reporting and task tracking in the new Battlefield title by identifying repeating tickets/issues emerging from their automated bug testing workflow and consolidating them. It’s a good example of how to start using this technology to support production, but without putting generative AI into the game itself.
I think this is a fantastic example of how generative AI can be used in games. Plus, from what I’ve heard, it has proven quite popular within EA, with AgentMerge being rolled out into other productions given its demonstrated value. It’s saving time and allowing developers to do their jobs more efficiently, but it isn’t part of the game itself.
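EA hasn’t published AgentMerge’s implementation details, so purely as an illustration, here’s a minimal sketch of the consolidation idea in Python. It uses difflib’s string similarity as a crude stand-in for the LLM that would judge whether two tickets describe the same issue; the `consolidate_tickets` function name and the threshold value are my own assumptions, not anything from EA.

```python
from difflib import SequenceMatcher

def consolidate_tickets(tickets, threshold=0.8):
    """Group near-duplicate bug reports into clusters.

    difflib's ratio is a crude similarity stand-in; a production
    system like the one EA described would use an LLM or embedding
    model to decide whether two tickets describe the same issue.
    """
    clusters = []  # each cluster is a list of similar ticket strings
    for ticket in tickets:
        for cluster in clusters:
            # Compare against the cluster's representative (first) ticket.
            similarity = SequenceMatcher(None, ticket.lower(), cluster[0].lower()).ratio()
            if similarity >= threshold:
                cluster.append(ticket)
                break
        else:
            clusters.append([ticket])  # no match found: start a new cluster
    return clusters

tickets = [
    "Crash when opening inventory on map screen",
    "Crash when opening the inventory on map screen",
    "Audio cuts out after respawn",
]
clusters = consolidate_tickets(tickets)
print(len(clusters))  # prints 2 - the two inventory crashes merge into one cluster
```

The appeal of this shape of tool is that a triage team reviews one consolidated cluster instead of dozens of near-identical reports from an automated testing pipeline, and nothing generated ever reaches the shipped game.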
But what happens when you put it into the game?
When You Try It Out to Mixed Results
Since last year, Steam has made it compulsory for games to disclose what AI content (i.e. use of generative AI tools, outputs, and models) is in a given product. Games on Steam now need to declare this, and the results are often quite mixed. In almost all cases, you’ll find it is AI-generated assets - textures, images, dialogue, story, etc. This is where the headaches for developers come in, given there’s an invisible line for how much AI-generated content is too much. For some players, that number is one asset; for others it’s a threshold beyond which they lose interest. Games such as The Finals have largely come out unscathed from using AI-generated commentators. Meanwhile many an indie game with AI assets gets mixed-to-negative reviews, or is simply not played.
Meanwhile, earlier this year we talked about how inZoi - a game inspired by The Sims that used a lot of generative AI models - was not only the most-wishlisted game on Steam, but subsequently sold 1 million copies in its first week. This was a game that generated a lot of excitement specifically around how it uses modern generative AI as a gameplay feature. Plus it had a decent production budget to back itself up, and could argue it was a more prestige title exploring the use of generative AI in gameplay. Since then, the sentiment around the game has largely fallen away, with the number of concurrent players on Steam sitting at around 1,500 a day - a far cry from the 87,000 it had at launch (a drop of around 98% in less than three months). That’s not a good read, and it speaks to the game losing players’ interest very quickly, given the quality of the game isn’t sufficient to carry it. By comparison, The Sims 4 - a game released back in 2014 - is happily running at around 30,000 concurrent players on Steam on a daily basis. Heck, The Sims 3 has had better player counts at some points.
On that note, I still haven’t played it. My backlog is bad enough as it is…
When You Cover It Up and Get Caught
So yeah, there’s an increasing apathy towards the use of generative AI as a means to add in-game assets, and even gameplay. While I think we’re still going to see more interesting and meaningful applications of the technology, more than anything it’s important that developers go out there and show they’re trying to figure that out, and that they’re trying to do so above board, ethically (and legally). Games like Retail Mage and Millennium Whisper are good examples of this, having been public about what they’re trying to do and how they’re trying to do it. Meanwhile some of the more high-profile examples, such as Dead Meat and 1001 Nights, have both presented their work at events like GDC before release.
Meanwhile 11-bit Studios are now learning the hard way what happens when you use it for in-game assets, and then try to keep it quiet. The Alters was released in June of this year and, coming from the studio behind the acclaimed strategy sim Frostpunk, it had good word of mouth combined with a solid critical consensus, which put it on my radar. The game itself - a survival strategy game about making clones of yourself to run a space station - is a really interesting concept and, having played it myself for a couple of hours, quite fun to play.
However, all of that positive buzz online came grinding to a halt as accusations of using generative AI to fill in some of the incidental text on background monitors were later corroborated by the studio itself. 11-bit apologised, posting a statement on Twitter that reads as follows:
We’ve seen a wide range of accusations regarding the use of AI-generated content in The Alters, and we feel it’s important to clarify our approach and give you more context. AI-generated assets were used strictly as temporary WIPs during the development process and in a very limited manner. Our team has always prioritized meaningful, handcrafted storytelling as one of the foundations of our game.
During production, an AI-generated text for a graphic asset, which was meant as a piece of background texture, was used by one of our graphical designers as a placeholder. This was never intended to be part of the final release. Unfortunately, due to an internal oversight, this single placeholder text was mistakenly left in the game. We have since conducted a thorough review and confirmed that this was an isolated case, and the asset in question is being updated.
For transparency, we’ve included a screenshot to show how and where it appears in the game. While we do not want to downplay the situation, we also want to clearly show its limited impact on your gaming experience.
In addition to that, a few licensed movies that the alters can watch in the social area of the base were added at the final stage of development. While those were externally produced, our team was not involved in the creative process, and these required additional last-minute translations. Due to extreme time constraints, we chose not to involve our translation partners and had these videos localized using AI to have them ready on launch. It was always our intention to involve our trusted translation agencies after release as part of our localization hotfix, to ensure those texts would be handled with the same care and quality as the rest of the game. That process is now underway, and updated translations are being implemented.
To give you a better understanding of what a small part of the overall scope of the game’s narrative layer they are, those few external movies are approx. 10k words out of 3.4 million across all languages in the game, or just 0.3% of the overall text. The alternative was to release those specific dialogues in English only, which we believed would be a worse experience for non-English-speakers. In hindsight, we acknowledge this was the wrong call. Even more so, no matter what we decided, we should have simply let you know.
As AI tools evolve, they present new challenges and opportunities in game development. We’re actively adapting our internal processes to meet this reality. But above all, we remain committed to transparency in how we make our games. We appreciate your understanding and continued support as we work towards that goal.
Lorem Ipsum Dolor Sit Amet
As stated by the studio, the amount of text that was AI-generated was relatively minimal and, in the case of the in-game monitors, intended as incidental. Having been on this side of the fence as a game developer, I know that sometimes the stuff we work on has gaps in quality that creep through. Nobody is perfect, and sometimes we make mistakes that result in bugs, broken assets, and the like sneaking into publicly available builds. Meanwhile, I have used many a horrid and terrible-looking placeholder asset to help me get where I’m going. In fact, in some of my earliest games I would create ‘textures’ in Microsoft Paint because a) it was quick and b) I knew the art team would rip them out (with fervour) at the earliest of opportunities - sorry Matt!

However, to editorialise a little here, the statement from 11-bit speaks to a culture of using generative AI tools to facilitate aspects of the game without thinking of the broader repercussions. After all, if you look at that annotated screenshot above (which was included in the tweet by the studio), you can see how small it is - and credit to whoever figured this out online, as that’s a level of scrutiny I am too old and tired to care for. But why didn’t they just use lorem ipsum? After all, the whole point of lorem ipsum text is to act as a placeholder while ensuring formatting and structure are sensible and aligned. You don’t need AI-generated text for what is ultimately an insignificant body of background material, but the studio had chosen to do that, or permitted/enabled that member of staff to do so.
Meanwhile, the localisation of the in-game movies is a whole other matter, and it speaks to a degradation of quality control towards launch. Frankly, had I been in a decision-making position at the time, I would have opted not to include the subtitles, made the time constraints clear at release, and promised they would be added in a future update. It’s clear that rushing this through is what led to them being caught out.
Perhaps the most disappointing aspect of all of this is that those bits of incidental material in the background that enrich the overall game are exactly the sort of thing that many junior developers and designers are given to work on to help them cut their teeth. So it’s removing an opportunity for people to build up their skills on the job - a recurring issue with the increasing adoption of AI tooling.
Failure to Disclose
The use of the placeholder text is one thing, but the other aspect that 11-bit Studios has caught justified criticism for is that they failed to disclose the use of AI-generated text on the Steam page, as they should have done. At the time of publishing this issue, they have still not updated this disclosure, despite having since admitted in their social media post that they intentionally used AI translations for the movie subtitles. Even if it’s temporary, as they state, it should have been made clear, and their failure to do so is feeding the negative sentiment.
Players are Primed to Spot AI-Generated Content
For a portion of the gaming community, any AI-generated content is considered a degradation of quality. It speaks to a reduction of standards and an emphasis on expediency over value, and at a time when game prices are increasing while wages sit still - even stagnate - for many this is a step too far.
The Alters isn’t the only game in recent weeks to face these criticisms. The recently announced Jurassic World Evolution 3 received significant backlash for the use of AI art in the scientist portraits shown in game footage - again, an incidental use case - and the developers have since backtracked and decided to remove it entirely.
Meanwhile, the developers of Wheelsprung - a title released as part of season two of games for the crank-carrying Playdate console - caused a furore when people realised it was made using code-generation tools like GitHub Copilot. This was an interesting one, given the sentiment came from a feeling that the gameplay lacked polish and quality, and now, with the acknowledgement that they used AI to help write the code, this is considered the reason for that degradation in value. Though the developers themselves have insisted the game was not ‘vibe coded’ - a ridiculous term used to describe the practice of creating software solely (or predominantly) through LLM prompts.
Plus, Keywords Studios have been exploring their second attempt at adopting AI tools for making games. After we discussed Project Ava in the newsletter last year, they’ve since developed Project Kara, where the approach is now to try to remaster an existing game - 2021’s mobile game Detonation Racing - through an AI-driven pipeline. This raises an interesting question in itself: whether players perceive value in a game that is remastered predominantly through AI. While this has been on my topic to-do list for a while, Ed Nightingale wrote a really good piece on it over on Eurogamer.

Frankly, this - like many issues in games right now - is one the industry has created for itself. The Alters is but the latest in a long line of situations where players feel a sense of mistrust towards the products being sold to them. Some of this is somewhat justified, but equally a lot of it is artificially inflated by a media space largely dominated by influencers (um… hi!) who push agendas that suit their needs rather than maintaining any line of editorial integrity. Game development has suffered from these issues for a long time given how it frequently obfuscates the processes involved, and players seldom understand the reality of how games are made. Combine this with artificial intelligence - which is an equally poorly understood, misrepresented, and commercially exploited technology, that is also legally unrestrained at this time - and this is a recipe for disaster that has been years in the making.
NOTE: This is one of the reasons why I started AI and Games, because if there are two disciplines that permeate modern society that are horribly misunderstood it’s artificial intelligence, and game development. That’s why we do what we do!
Selective Outrage?
But equally, for every game being raked over the coals for its adoption of incidental AI assets, there are others that have adopted it and largely dodged criticism - or, in at least one instance, did so early enough that players were largely unaware.

The critically lauded Clair Obscur: Expedition 33 has had numerous individuals crop up on social media highlighting that they found evidence of AI-generated textures - which, considering the false narrative that the game was made by 30 people, would have turned into quite the little shitstorm. But this has largely dissipated, and it seems the evidence of said AI textures has now been removed, without any comment about it from the studio.
Meanwhile, as I pointed out recently on BlueSky, the first game that actually used AI-generated assets for a remaster was Mass Effect: Legendary Edition, which came out in 2021. This was a process whereby the artists used AI for the first pass, and then often cleaned the results up by hand or simply redid them outright. I reported on this at the time over on YouTube, and it used the first generation of what we now consider generative AI tools. But a combination of a lack of knowledge, a lack of understanding of what generative AI in games meant in 2021, and ultimately, I feel, an apathy for the subject given how much people love this re-release, has led to it being brushed under the rug.
But Does it Matter in the End?
I think a big question hanging over this for me is how much the social media sentiment is going to affect the success of the games involved. After all, inZoi has sold very handsomely, the Steam reviews for The Alters are currently largely unaffected by the social media furore, and Clair Obscur: Expedition 33 is going to appear on many a GOTY list in 2025 - and it’s certainly a contender to appear in mine!
This isn’t to discredit the arguments being raised, far from it - and as I mentioned earlier the situation with The Alters is one of their own making, and could easily have been avoided. But is this aforementioned portion of the gaming community much smaller than the broader gaming masses? Is it down to a lack of understanding and communication? It’s hard to grasp at this time how big a deal it is for many a player, and ultimately whether they care.
But as Xalavier Nelson Jr of Strange Scaffold intimated on BlueSky, it speaks to the problems that can arise - notably the erosion of trust, and the potential loss of sales - when adopting AI-generated assets in a production. Is the time ‘saved’ worth the risk? Yes, it takes less time and money to create this content, but are you shooting yourself in the foot, given that down the line people will find out, and it could lead to your game being derided, ignored, or caught in a backlash so severe it will cost you even more money to undo what you did?
Personally, I am not a fan of adopting it, and am of a similar mind to Xalavier: the risks of using it in a way that it appears in the game itself are ones a studio has to take into account. I am all for using AI to support production - we’ve been doing this for 10+ years - but using what has already become a much-derided technological ‘innovation’ in a commercial product is a very risky proposition.
The Problem is Nuanced, and Seldom Simple
While I am in the camp that a studio should be very mindful of how AI-generated content is integrated into projects, it doesn’t help that this poisoning of the well has led to many a false accusation of AI-generated assets. The notion of slop has breached the public consciousness not just as a means to describe AI-generated content; it’s equally a means to criticise content - either by declaring it of a quality that suggests it is AI-generated, or because people now assume works of a certain art style, made by humans, are in fact AI-generated.
Indie studio KingStyle published Little Droid on Steam back in 2024, but the announcement of the PS5 port by publisher Stamina Zero led to backlash when the video was published on the official PlayStation YouTube channel, with accusations that the title art (shown above) was AI-generated. The developers have come out in defence of the project, stating not just that this was a human-crafted piece of art, but that it is the work of a known artist in the games industry. Sadly, I can see how people would misinterpret it as AI-generated, given it evokes styles common to AI tools, but it speaks to a broader issue now affecting game developers: being accused of using AI even when they’re not.
You can read a good breakdown of the story by Nicole Carpenter over on The Guardian. Meanwhile, the game is currently discounted by 65% in the Steam Summer Sale!
Wrapping Up
This is far from the end of the story of how AI-generated assets appear in games, but the number of accusations, scandals, and the like is growing, and it could lead to an even more complex and divided landscape surrounding game development in the coming years. Video games are always a tough sell to consumers, and the field has created problems of its own over the years in how it treats its audience. For many with the purchasing power, AI assets are but the latest ploy that ruffles their feathers.
It’s worth considering now whether embracing AI in that part of production is worth the short-term savings, if it could lead to long-term losses.