What if your AI chatbot could host a full-fledged video game — not as a link, but right there in the conversation?
DOOM in ChatGPT and Claude. It’s happened. Chris Nager’s April 2026 experiment ports the 1993 id Software shooter directly into these AI clients using Anthropic’s Model Context Protocol (MCP). Forget calculators or fridges; this is code running inside language models, a geek milestone that spotlights MCP apps’ potential.
And here’s the thrill: it’s no parlor trick. MCP apps deliver interactive UIs rendered inline via iframes, pushing beyond text fetches or image returns. Developers in LATAM — or anywhere — building agents with Claude, ChatGPT, or Cursor now see architectural choices crystallized. This isn’t hype; it’s a blueprint for embedding rich experiences in AI flows.
How Does DOOM Run in ChatGPT and Claude?
MCP, launched by Anthropic in November 2024, acts like USB-C for LLMs. Clients such as Claude Desktop, Zed, or ChatGPT connect uniformly to external tools, data, apps — no custom hacks required. Servers become plug-and-play.
Early MCP servers spit out text or tool calls. MCP apps evolve that: they paint custom UIs inside the host’s iframe. Nager’s setup uses this for DOOM, powered by cloudflare/doom-wasm — the original engine compiled to WebAssembly, paired with Freedoom Phase 1 for license-free redistribution.
The architecture? Stripped to essentials after iterations. Three core pieces:
- create_doom_session: Kicks off an inline MCP app session in supporting clients.
- get_doom_launch_url: Falls back to a plain URL for others.
- /doom/play route: Runs the game via a signed token in the URL.
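A rough way to picture the split between the two tools is as descriptors that advertise what each one returns, with the client picking whichever it can render. The descriptor shape and `pickTool` helper below are illustrative, not the actual MCP SDK schema:

```typescript
// Hypothetical tool descriptors mirroring the three pieces described above.
// Field names are illustrative, not the real MCP SDK schema.
interface ToolDescriptor {
  name: string;
  description: string;
  returns: "mcp-app" | "url";
}

const tools: ToolDescriptor[] = [
  {
    name: "create_doom_session",
    description: "Start an inline MCP app session in clients that support apps",
    returns: "mcp-app",
  },
  {
    name: "get_doom_launch_url",
    description: "Return a plain launch URL as a fallback for other clients",
    returns: "url",
  },
];

// Clients that support MCP apps take the inline path; others fall back to a URL.
function pickTool(supportsApps: boolean): string {
  const wanted = supportsApps ? "mcp-app" : "url";
  const tool = tools.find((t) => t.returns === wanted);
  return tool ? tool.name : "get_doom_launch_url";
}
```

The fallback tool is what keeps the experiment usable even on hosts that haven’t shipped MCP app support yet.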
That token — a JWT — is genius. It launches DOOM without server pings for state, decoupling backend from runtime. The browser, or Claude’s iframe, handles execution solo.
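A minimal sketch of that idea, assuming an HMAC-signed (HS256-style) JWT with a server-held secret — the real server’s claims and secret handling aren’t public, so the payload fields here are illustrative:

```typescript
import { createHmac } from "node:crypto";

// base64url encoding as used by JWTs (no padding, URL-safe alphabet).
const b64url = (buf: Buffer): string =>
  buf.toString("base64").replace(/\+/g, "-").replace(/\//g, "_").replace(/=+$/, "");

// Sign a launch token. Payload claims ("game", "exp") are illustrative.
function signLaunchToken(payload: object, secret: string): string {
  const header = b64url(Buffer.from(JSON.stringify({ alg: "HS256", typ: "JWT" })));
  const body = b64url(Buffer.from(JSON.stringify(payload)));
  const sig = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  return `${header}.${body}.${sig}`;
}

// Verification needs only the token and the secret — no session store,
// no round-trips: the token is self-contained.
function verifyLaunchToken(token: string, secret: string): boolean {
  const [header, body, sig] = token.split(".");
  const expected = b64url(createHmac("sha256", secret).update(`${header}.${body}`).digest());
  return sig === expected;
}

const token = signLaunchToken({ game: "doom", exp: 1893456000 }, "server-secret");
// The launch URL carries everything the iframe needs to start the game.
const launchUrl = `https://example.com/doom/play?token=${token}`;
```

Because the token encodes its own validity, the /doom/play route can stay stateless: the backend signs once and drops out of the loop.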
Nager shared this gem:
The most elegant detail is the signed token: the launch URL contains a cryptographic token that is enough to start the game, without the browser having to make round-trips to the server to maintain state.
Why Does This Matter for AI Developers?
The real battle wasn’t WebAssembly compilation — that’s old news. It was taming diverse client security policies. Hosts like Claude web, ChatGPT, or Codex enforce varying rules on nested iframes, CSP, frame-src.
Initial attempts nested an iframe inside the MCP app iframe for /doom/play. Worked sometimes, crashed others. Fix? Ditch the wrapper. Mount DOOM’s canvas directly in the host’s iframe. No nesting, no frame-src woes, no navigation assumptions.
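The difference between the two approaches can be sketched as the markup the app emits — the strings below are illustrative, not Nager’s actual code:

```typescript
// Fragile approach: a second iframe nested inside the MCP app iframe.
// Each host applies its own CSP frame-src rules to that nested navigation,
// so it works on some clients and breaks on others.
function nestedMarkup(playUrl: string): string {
  return `<iframe src="${playUrl}"></iframe>`;
}

// Robust approach: the WASM engine draws straight onto a canvas mounted
// in the host-provided iframe. No nested navigation, no frame-src involved.
function directMarkup(): string {
  return `<canvas id="doom-canvas" width="640" height="400"></canvas>`;
}
```

The direct mount is also why the signed token matters: with no inner page to navigate to, the app needs everything to boot the engine handed to it up front.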
Here’s the flow:
```mermaid
flowchart LR
    U["User in Claude/ChatGPT"] --> H["MCP host"]
    H -->|calls tool| S["MCP server"]
    S -->|signs token| T["JWT token"]
    T --> A["MCP app iframe"]
    A -->|loads WAD| W["DOOM WASM"]
    W --> C["Inline canvas"]
```
Classic lesson: fragile layers? Strip ‘em.
This echoes the browser wars of the late ’90s — when Java applets promised embedded apps, but security silos killed momentum. MCP sidesteps that, standardizing UI across clients. Bold prediction: by 2028, expect MCP apps powering collaborative IDEs inside ChatGPT, not just games.
But skepticism creeps in. Anthropic pitches MCP as open, yet host fragmentation persists — not every client supports apps yet. Nager’s austere version shines by dodging session persistence and screenshots, but scaling to real apps? That demands more.
DOOM inside Claude’s web interface. Proof the ceiling soars.
MCP began exposing text tools to models. Apps extend to full UI. Running DOOM isn’t the goal; it’s evidence of vast headroom.
For LATAM devs eyeing AI agents, prioritize stateless designs like signed tokens. Test across hosts early — ChatGPT’s iframe leniency differs from Claude Desktop’s strictness. And lean on WebAssembly for compute-heavy payloads; it future-proofs against client upgrades.
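Stateless here means the launch route can decide everything from the token itself — for instance, an expiry check with no database lookup. The claim name (`exp`) is illustrative:

```typescript
// Hypothetical stateless check for a /doom/play-style route: decode the
// token's payload segment and validate expiry locally. No session store.
function canLaunch(tokenPayloadB64: string, nowSeconds: number): boolean {
  const json = Buffer.from(tokenPayloadB64, "base64url").toString("utf8");
  const claims = JSON.parse(json) as { exp?: number };
  // Tokens without an expiry are treated as valid in this sketch.
  return claims.exp === undefined || claims.exp > nowSeconds;
}
```

Pair this with signature verification, and the route never touches shared state — exactly the property that let DOOM boot from a URL alone.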
The Bigger Shift: AI Clients as Platforms
This isn’t just DOOM. It’s harbinger of AI interfaces evolving into runtime environments. Imagine MCP apps for live data viz, 3D modelers, or even multiplayer sims — all inline, model-aware.
Unique insight: like HTTP standardized web UIs in 1995, MCP could unify AI tooling, birthing an app ecosystem rivaling mobile stores. Yet corporate spin alert — Anthropic’s “USB-C” analogy glosses over adoption hurdles. True universality needs rivals like OpenAI fully on board.
Energy surges here. Platforms shift when boundaries blur — LLMs hosting games signal code execution woven into conversation. Developers, grab MCP docs; the next wave awaits.
Tip for MCP app builders: Shun nested iframes. Embed directly.
The experiment demands attention. It illuminates paths forward, warts and all.
Frequently Asked Questions
What is the Model Context Protocol (MCP)?
Anthropic’s open standard for LLMs to connect uniformly with external tools, data, and UIs — think USB-C for AI clients.
Can I run DOOM in ChatGPT right now?
Yes, via Nager’s public MCP server — if your client supports MCP apps. Fallback URL works universally.
Will MCP apps replace traditional web apps in AI?
Not soon, but they’ll embed powerful UIs inline, transforming chats into interactive platforms.