Highlights
- Shrine’s Legacy devs denied claims of using generative AI in their game.
- The studio rebutted a negative Steam review that labeled its narrative as ChatGPT-written.
- The backlash reflects a rise in false accusations of AI use against indie developers.
The creators of the indie role-playing game Shrine’s Legacy have issued a firm rebuttal after players accused them of using generative artificial intelligence to build their game. Positive Concept Games, the studio behind the Super Nintendo-inspired title, released a statement on social media rejecting claims from a negative Steam review that labeled their work as "AI slop" and asserted the story was written by ChatGPT. The developers emphasized that the project was the product of a years-long development cycle involving only human artists. Their response highlights a growing problem in the gaming industry, where traditional craftsmanship is increasingly mistaken for machine-generated content.
The controversy erupted when a hostile Steam review described the game’s narrative as being "made in ChatGPT" and heavily criticized its quality. In response, the studio took to X to address the community directly, stating that they had poured years of their lives into the project and worked exclusively with human talent on everything from writing to coding.
The developers clarified their stance on the technology, declaring that they do not endorse generative AI and will never use it. Despite this transparent response, the studio noted that they are now forced to spend their launch window defending the authenticity of their work rather than discussing the game itself.
April Fool’s Joke Misinterpreted
The situation was further complicated by online sleuths scrutinizing the studio's past social media activity. Some users pointed to an old April Fool’s Day post in which the studio experimented with an AI tool as "proof" of AI use. The developers immediately addressed this, explaining that the post was a one-off joke intended to mock AI trends: the punchline was the use of AI itself, not its integration into their workflow.
However, this did not stop other reviews from picking apart perceived inconsistencies, such as slight abnormalities in character art or a "vibe" that the story felt artificial. One reviewer even admitted to purchasing the game not to play it, but solely to hunt for evidence of AI usage.
The incident reflects a broader wave of anxiety and mistrust in the PC gaming community, where players increasingly struggle to tell human work from machine-generated content. Although Valve introduced rules in 2024 requiring developers to disclose AI content, and nearly 8,000 games on Steam now carry such disclosures, players have become hyper-vigilant.
A 2024 Microsoft study found that users correctly identify AI-generated content only about 62% of the time, suggesting that "gut feelings" about art or dialogue are frequently wrong. For small indie teams like Positive Concept Games, these false positives can be devastating: baseless accusations harm sales and reputation in an industry where proving a negative is increasingly difficult.