Original Reddit post

Here’s a tool you guys might find useful. qi is an ultra-fast knowledge search CLI for the files on your local machine. No dependencies, no runtime, just a single executable that indexes code, docs, notes, papers, logs, and other text into SQLite, then gives you BM25 search, optional vector search, and grounded LLM Q&A with citations. Use it fully offline with Ollama, LM Studio, llama.cpp, or MLX, or connect OpenAI for cloud models. You can also save tokens by delegating some of Claude Code’s search work to qi. Check it out and let me know what you think. Repo: https://github.com/itsmostafa/qi
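For anyone curious how BM25 search can work on top of SQLite with no extra dependencies: SQLite's FTS5 extension ships a built-in `bm25()` ranking function. The sketch below is a minimal illustration of that general technique, not qi's actual code; the table name, columns, and sample documents are all made up for the example.

```python
import sqlite3

# In-memory database for the demo; a real index would persist to a file.
# Requires a SQLite build with FTS5 (the default in most Python builds).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE docs USING fts5(path, body)")
conn.executemany(
    "INSERT INTO docs (path, body) VALUES (?, ?)",
    [
        ("notes/sqlite.md", "SQLite ships FTS5 with BM25 ranking built in"),
        ("notes/search.md", "BM25 scores documents by term frequency and rarity"),
        ("notes/todo.md", "buy milk"),
    ],
)

# bm25() returns lower (more negative) scores for better matches,
# so ascending order puts the most relevant document first.
rows = conn.execute(
    "SELECT path, bm25(docs) FROM docs WHERE docs MATCH ? ORDER BY bm25(docs)",
    ("bm25",),
).fetchall()
for path, score in rows:
    print(path, round(score, 3))
```

Both matching notes come back ranked; the unrelated one is filtered out before ranking even happens, which is what makes FTS5 fast on large corpora.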

Originally posted by u/purealgo on r/ClaudeCode