New release of Memora, the open-source MCP memory server that gives AI agents persistent memory across sessions.

What's new in v0.2.21:

- Chat Panel — a RAG-powered chat built into the knowledge graph UI. Ask questions about your stored memories, get streaming LLM responses with cited sources, and click any [Memory #ID] to highlight that node and its connections in the graph. Hidden by default; toggles from a floating icon at the bottom-right.
- Default chat model — configurable via the CHAT_MODEL environment variable.

Other improvements:
- Pagination for timeline memory list
- Consolidated frontend (single source of truth for local + cloud)
- Favorite star toggle with filtering
- Action history with grouped timeline view
- Memory insights with LLM-powered pattern analysis
- Better exception logging and hierarchy module extraction

Works on both the local Python server and the Cloudflare Pages deployment.

GitHub: github.com/agentic-mcp-tools/memora
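For the new default-chat-model setting, configuration is just an environment variable. A minimal sketch — the CHAT_MODEL name comes from the release notes above, but the model value and server start command are assumptions, so check the repo README for the supported values and actual entry point:

```shell
# CHAT_MODEL is the variable named in v0.2.21; the model name below is
# only an illustration, not necessarily a value Memora supports.
export CHAT_MODEL="gpt-4o-mini"

# Then start the local server as usual (command assumed; see the README):
#   python -m memora
```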
Originally posted by u/spokv on r/ClaudeCode
