
Karpathy's LLM Wiki Nails It – But Local Setup's Friction Killed It for Me

What if your AI notes actually remembered everything, instead of rehashing the same docs on every query? Karpathy's LLM Wiki does exactly that – I built Hjarni to fix its biggest pains.

Andrej Karpathy's LLM Wiki setup in Obsidian with Claude Code integration

⚡ Key Takeaways

  • Karpathy's LLM Wiki fixes RAG's rediscovery waste with persistent, LLM-maintained markdown knowledge.
  • Local setups suffer friction – single machine, single client, hard to share – which kills the habit.
  • Hjarni hosts the pattern over MCP for smooth multi-device, multi-LLM access, trading git for ubiquity.
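The core of the pattern described above can be sketched in a few lines: a persistent markdown knowledge store that an LLM client reads before answering and updates after learning something, so facts survive across sessions instead of being rediscovered. This is a minimal illustrative sketch, not Hjarni's actual API – the class, method names, and note layout are all assumptions.

```python
from pathlib import Path

class MarkdownWiki:
    """Minimal persistent knowledge store: one markdown file per topic.
    In the wiki pattern, an LLM client calls read_note before answering
    and update_note after learning something new, so knowledge accumulates
    across sessions instead of being re-derived every query."""

    def __init__(self, root: str) -> None:
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def _path(self, topic: str) -> Path:
        return self.root / f"{topic}.md"

    def read_note(self, topic: str) -> str:
        # Return accumulated knowledge, or empty string for a new topic.
        p = self._path(topic)
        return p.read_text() if p.exists() else ""

    def update_note(self, topic: str, fact: str) -> None:
        # Append a bullet; the note grows across sessions.
        p = self._path(topic)
        header = "" if p.exists() else f"# {topic}\n\n"
        with p.open("a") as f:
            f.write(f"{header}- {fact}\n")

wiki = MarkdownWiki("kb")
wiki.update_note("deploy", "Staging uses region eu-west-1.")
print(wiki.read_note("deploy"))
```

In a hosted setup like the one the article describes, these two operations would be exposed as MCP tools over the network, so any MCP-capable client on any device reads and writes the same notes – that is the "trading git for ubiquity" tradeoff in the takeaways.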
Published by theAIcatchup

Originally reported by dev.to
