https://github.com/milla-jovovich/mempalace
Anyone tested this? No promotion, I have no connection to this; I just saw it.

Every conversation you have with an AI (every decision, every debugging session, every architecture debate) disappears when the session ends. Six months of work, gone. You start over every time. Other memory systems try to fix this by letting the AI decide what's worth remembering: they extract "user prefers Postgres" and throw away the conversation where you explained *why*. MemPalace takes a different approach: store everything, then make it findable.

**The Palace:** Ancient Greek orators memorized entire speeches by placing ideas in the rooms of an imaginary building; walk through the building, and you find the idea. MemPalace applies the same principle to AI memory: your conversations are organized into wings (people and projects), halls (types of memory), and rooms (specific ideas). No AI decides what matters: you keep every word, and the structure makes it searchable. That structure alone improves retrieval by 34%.

**AAAK:** A lossless shorthand dialect designed for AI agents. Not meant to be read by humans; meant to be read by your AI, fast. 30x compression, zero information loss: your AI loads months of context in ~120 tokens. And because AAAK is just structured text with a universal grammar, it works with any model that reads text: Claude, GPT, Gemini, Llama, Mistral. No decoder, no fine-tuning, no cloud API required. Run it against a local model and your entire memory stack stays offline. Nothing else like it exists.

**Local, open, adaptable:** MemPalace runs entirely on your machine, on any data you have locally, without calling any external API or service. It has been tested on conversations, but it can be adapted to other types of datastores. This is why we're open-sourcing it.
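The wings/halls/rooms hierarchy described above can be sketched in a few lines. This is a minimal illustration assuming a simple nested structure with naive substring search; the class and method names (`MemPalace`, `store`, `find`) are hypothetical and the actual repo's API may look nothing like this.

```python
# Hypothetical sketch of the wings -> halls -> rooms hierarchy.
# Wings group by person/project, halls by memory type, rooms hold full transcripts.
from dataclasses import dataclass, field


@dataclass
class Room:
    name: str
    transcript: str  # full conversation text, stored verbatim (nothing discarded)


@dataclass
class Wing:
    subject: str                                   # a person or a project
    halls: dict = field(default_factory=dict)      # memory type -> list[Room]


class MemPalace:
    def __init__(self):
        self.wings: dict[str, Wing] = {}

    def store(self, subject: str, kind: str, name: str, transcript: str) -> None:
        wing = self.wings.setdefault(subject, Wing(subject))
        wing.halls.setdefault(kind, []).append(Room(name, transcript))

    def find(self, query: str) -> list[tuple[str, str, str]]:
        # Naive retrieval: walk the structure and match the query against
        # room names and transcripts, returning (wing, hall, room) paths.
        q = query.lower()
        return [
            (wing.subject, kind, room.name)
            for wing in self.wings.values()
            for kind, rooms in wing.halls.items()
            for room in rooms
            if q in room.name.lower() or q in room.transcript.lower()
        ]


palace = MemPalace()
palace.store("api-server", "decisions", "database choice",
             "We chose Postgres because we need transactions and JSONB.")
print(palace.find("postgres"))  # → [('api-server', 'decisions', 'database choice')]
```

The point of the structure is that a hit comes back as a path (wing, hall, room), so the surrounding context of a memory is recoverable rather than reduced to an extracted one-liner.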
Originally posted by u/weltscheisse on r/ClaudeCode
