Running Local LLMs Inside Cursor, Opencode, and Crush
I wanted the best of both worlds: local models for privacy and cost, plus the scaffolding that makes modern tools actually pleasant to use. The plan was straightforward: LM Studio running the model, LiteLLM as the OpenAI‑compatible face, and three clients in the mix (Cursor, Opencode, and Crush). No spend, no drama, just enough configuration…
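The wiring described above can be sketched as a LiteLLM proxy config that forwards OpenAI-style requests to LM Studio's local server. This is a minimal sketch, not the post's actual config: the model name `qwen2.5-coder-7b-instruct` and port `1234` (LM Studio's default) are assumptions, and the API key is a placeholder since LM Studio doesn't require one.

```yaml
# litellm config.yaml — a hypothetical sketch, assuming LM Studio's
# OpenAI-compatible server is running on its default port (1234).
model_list:
  - model_name: local-coder            # name the clients will request
    litellm_params:
      model: openai/qwen2.5-coder-7b-instruct  # assumed model; "openai/" routes via the OpenAI-compatible provider
      api_base: http://localhost:1234/v1       # LM Studio's local server endpoint
      api_key: "not-needed"                    # LM Studio ignores the key, but clients often require one
```

With a config like this, `litellm --config config.yaml` exposes a single endpoint that Cursor, Opencode, and Crush can all point at as if it were OpenAI.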