Original Reddit post

Hey folks. I'm currently working on a contract that stipulates I can't share any of the customer's code with commercial AI software. I've spent a few days trying to put together a completely local setup, with mixed results. I started with Ollama, tried a few models (Gemma 4 26b, Gemma 4 e4b, and Qwen 2.5), and used both pi.dev and Continue.dev (the latter as a VSCode extension). The big issue is that the codebase is fairly complex, and both pi and Continue.dev, across the various models, keep putting themselves into infinite loops searching files. Are there any recommendations for local-only setups that can handle large/complex codebases? Work machine is an M4 MBP with 24GB RAM. Thanks in advance!
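[Editor's note: one thing worth ruling out, not something mentioned in the post, is Ollama's default context window. It has historically been small (2048 tokens unless overridden), so an agent's growing tool-call history gets silently truncated, which can look exactly like an infinite file-search loop. A Modelfile can raise it. The model tag and the 32k value below are illustrative assumptions; a 32k context on a 24GB machine may be tight with larger models.]

```
# Hypothetical Modelfile: derive a new local model with a larger context window.
# FROM must name a model already pulled with `ollama pull`.
FROM qwen2.5-coder:14b

# Raise the context window from Ollama's small default so agent/tool-call
# history isn't truncated mid-conversation. Increases RAM use accordingly.
PARAMETER num_ctx 32768
```

Build it with `ollama create qwen-bigctx -f Modelfile`, then point Continue.dev at the `qwen-bigctx` model instead of the base tag. (Continue.dev's config also lets you set `contextLength` per model, which should match.)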

Originally posted by u/charliegriefer on r/ArtificialInteligence