Replacing GitHub Copilot with Local LLMs

I try to stay at the cutting edge of everything AI, especially when it comes to LLM-enabled development. I've tried GitHub Copilot, Supermaven, and many other AI code completion tools. However, earlier this week I gave locally hosted LLMs a try, and I am not going back.

What about LM Studio? I saw a few posts debating one over the other. LM Studio has an intuitive UI; Ollama does not. However, my research led me to believe that Ollama is faster than LM Studio.
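
To make this concrete, here is a minimal sketch of what asking a locally hosted model for a completion can look like. It assumes Ollama is serving its default HTTP API on localhost:11434 and that a code model has already been pulled; the model name qwen2.5-coder is purely illustrative, since the post doesn't name one.

import json
import urllib.request

# Ask a locally hosted Ollama model to complete a code snippet.
# Assumes `ollama serve` is running and a code model has been pulled
# beforehand (the model name below is illustrative).
payload = {
    "model": "qwen2.5-coder",
    "prompt": "def fibonacci(n):\n    ",
    "stream": False,
}

request = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

with urllib.request.urlopen(request) as response:
    completion = json.loads(response.read())["response"]

print(completion)

An editor plugin does essentially this on every pause in typing, which is why local inference speed matters so much for autocompletion.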

I was hesitant to adopt local LLMs because services like GitHub Copilot "just work." However, as I've been traveling the world, I often found myself regretting having to depend on an Internet connection for my autocompletions. In that sense, switching to a local model has been a huge win for me. If Internet connectivity were not an issue, I think services like Supermaven would still be very appealing and worth the cost.

If you are not familiar with Supermaven and are okay with depending on an Internet connection, it's worth checking out. Compared to GitHub Copilot, I found Supermaven's autocompletion to be much more reliable and much faster.
