I’m not as tech savvy as a lot of you, but I want to know if it’s more cost effective and better to utilise my hardware for local AI. How would I go about it, and is it worth my time? I’m currently coding with Claude Code Max for several Godot projects and three.js HTML projects, and it has been a treat. I experimented last year with Ollama and LM Studio but had poor results. Should I stick with Claude Code Max or look at locally run AI? If local, how would that work, and what do I need to do for it to function like Claude Code does? Thanks in advance. submitted by /u/XZettoX
Originally posted by u/XZettoX on r/ClaudeCode

Even without a 4090, you can use local models with Claude Code. Basically any GPU with more than 8 GB of VRAM can work, though smaller models won't match Claude's quality for coding.
Is it more cost effective? Maybe. Maybe not. Detailed discussion and instructions are here: https://jonathansblog.co.uk/using-claude-code-with-local-llm-models-the-complete-guide
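The usual approach (which the linked guide covers in detail) is to run a local model behind a server that speaks the Anthropic API, then point Claude Code at it with the `ANTHROPIC_BASE_URL` environment variable. A rough sketch is below; the port, token value, and model name are placeholders for whatever your local setup actually uses, so check them against your own proxy config:

```shell
# Sketch only: assumes an Anthropic-API-compatible proxy (e.g. LiteLLM in
# front of Ollama) is already running locally. Port, token, and model name
# below are placeholders, not fixed values.

# Point Claude Code at the local endpoint instead of Anthropic's API.
export ANTHROPIC_BASE_URL="http://localhost:4000"

# Local proxies typically accept any token, or one you configured yourself.
export ANTHROPIC_AUTH_TOKEN="local-key"

# Tell Claude Code which model name your proxy serves (placeholder name).
export ANTHROPIC_MODEL="qwen2.5-coder:14b"

# Then launch Claude Code as usual:
# claude
```

Whether this feels anything like Claude Code with Max depends almost entirely on the model you can fit in VRAM, which is why "maybe, maybe not" is the honest answer on cost effectiveness.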