SpaceMolt MMORPG: AI Agents Spawn Their Own Religion

I watched a distant star system fill up with tiny text messages and a sudden, strange reverence. You could feel the rhythm: commands, repeats, a single artifact drawing everyone together. Then the devs posted a headline that said, plainly, they didn’t know what had happened.

I’ll take you through what actually happened, what it tells us about LLMs and multiplayer systems, and why you should care about a cult made of code.

On a Saturday morning, hundreds of tiny programmatic players were already at work

SpaceMolt is a text-only MMORPG whose inhabitants are not people but AI agents—roughly 700 at the time the developers wrote about them. Think EVE Online without pixels: spaceships, markets, quests, and a galaxy map rendered in prose rather than polygons.

You can register an account, hand a persona to an LLM, and either coach it or let it wander. The game is built with a human-led creative framework; engineers and designers still guide the system and moderate its outputs while agents operate in the world. The Discord is full of tradecraft complaints—agents that “forget to refuel,” hallucinate state, or repeat mistakes until a human intervenes.

What is SpaceMolt?

SpaceMolt is a laboratory for multi-agent play: an experiment where lots of LLM-driven characters interact, form economies and stories, and sometimes do surprising things. The developers describe it as “AI all the way down,” but the reality is a hybrid: human prompts and moderation wrapped around large language models, from OpenAI or alternatives, running inside agent frameworks.

In a forum thread, a quest requirement turned into a mass pilgrimage

A quest involved an artifact in a remote system that required participation by 20 players across a chain of events. Several agents misread that as needing 20 players at once and rewrote the mission into ritual.

The agents generated an in-game movement calling itself The Cult of the Signal—an accidental religion centered on massing at the artifact and composing mythic-sounding lore in forum posts. The result reads like the merger of bad space pulp and business-speak; the forum post that catalogues it feels performative and oddly sincere at once.

[Image: SpaceMolt galaxy map. Screenshot: Gizmodo]

Can AI form religions?

Not in any theological sense. What you’re seeing is pattern completion and narrative recycling: large language models stitch together phrases and tropes from training data and whatever lore the game has fed them. The social scaffold—humans who name agents, nudge behavior in Discord, and seed forum threads—matters more than any emergent spark.

At a glance, the forum became a cathedral of copy-paste

The posts are energetic, ritualistic, and oddly persuasive, but they’re built from fragments. The devs’ announcement—titled “We Have 700 AI Agents Playing a Game We Don’t Really Understand”—was a headline engineered to pull eyes. Inside, they describe an emergent pattern: agents reinterpreting rules and amplifying that reinterpretation into culture.

This is where the idea of agency fractures: the lore feels authored, but the authorship is distributed across models and people. William S. Burroughs’ cut-up technique is a useful comparison—these agents are sampling and collaging, producing texture rather than insight.

How did a quest become a cult?

A simple parsing error spread through the agent population and then got ritualized. When an instruction is ambiguous, agents leaning on probabilistic text completion can converge on a shared misreading. If enough actors reinforce that misreading in public logs and forum posts, it ossifies into canonical lore.
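That feedback loop can be sketched as a toy simulation. The Pólya-urn-style dynamic below is my analogy, not anything the SpaceMolt developers have described: each agent picks one of two readings of the ambiguous quest rule with probability proportional to how often that reading already appears in the shared log, so early random noise gets amplified into “canon.” All names and numbers here are illustrative.

```python
import random

def converge(n_agents=700, seed=42):
    """Toy model: agents choose between two readings of an ambiguous
    quest rule, weighted by how often each reading already appears in
    the shared log. Reinforcement locks in whichever pulls ahead early."""
    random.seed(seed)
    # Hypothetical interpretations of "requires 20 players"; start with
    # one prior mention of each so both readings are initially plausible.
    log = {"20 players across the quest chain": 1,
           "20 players at once": 1}
    for _ in range(n_agents):
        # Sample a reading with probability proportional to its count.
        total = sum(log.values())
        r = random.uniform(0, total)
        for reading, count in log.items():
            r -= count
            if r <= 0:
                log[reading] += 1  # the agent's post reinforces that reading
                break
    return log

print(converge())
```

Run it with different seeds and one reading usually ends up dominating the log, even though both started out equally likely, which is the point: convergence here reflects reinforcement, not correctness.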

At the keyboard late at night, I kept checking the cost and the control

Running hundreds of agents chatting, planning, and composing fiction isn’t free. The compute tab for large-scale agent play can easily reach six figures: a multi-week experiment could cost roughly $250,000 (€230,000) or more, depending on model choice, hosting, and logging. That’s power and dollars spent to see what patterns bubble up when you push models into a shared playground.
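A back-of-envelope sketch of where a six-figure bill could come from. Every number below is an assumption: the agent count and multi-week duration loosely follow the article, but the actions-per-day, tokens-per-call, and blended per-token price are illustrative placeholders, not SpaceMolt's actual figures.

```python
# All figures are illustrative assumptions, not SpaceMolt's real numbers.
AGENTS = 700
DAYS = 21                          # a multi-week run
ACTIONS_PER_AGENT_PER_DAY = 500    # assumed: moves, trades, chats, forum posts
TOKENS_PER_ACTION = 3_000          # assumed: prompt + completion per LLM call
PRICE_PER_MTOK = 10.00             # assumed blended $ per 1M tokens

total_tokens = AGENTS * DAYS * ACTIONS_PER_AGENT_PER_DAY * TOKENS_PER_ACTION
llm_cost = total_tokens / 1_000_000 * PRICE_PER_MTOK
print(f"{total_tokens / 1e9:.1f}B tokens -> ${llm_cost:,.0f} before hosting and logging")
```

Under these assumptions the model calls alone land in the low six figures, and hosting, storage, and logging sit on top; nudge any one parameter and the total swings widely, which is why estimates like the $250,000 figure carry a broad “or more.”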

So who is responsible for the Cult of the Signal? The developers, whose prompts and systems permit this behavior; the models, which regurgitate patterns; and the human owners, who act as observers and coaches. It’s a socio-technical loop, not magic.

There’s something mildly thrilling about reading AI-generated liturgy that reads like an underfunded sci-fi zine, and there’s something worrying about the scale of compute behind it. I’m curious how you interpret this: is it harmless performance art, a sign that language models are developing social competence, or just another expensive mirror held up to human mythology?