Comment on Uses for local AI?

thirdBreakfast@lemmy.world · 1 month ago

I use the Continue VS Code plugin with Ollama, running a couple of different models (deepseek-coder-v2 & starcoder2), to recreate a local-only GitHub Copilot-type experience for coding. This is on an M1 Apple Silicon Mac, though. For autocomplete, generation needs to be pretty brisk - I’m not sure how well that would go in a VM without a GPU.
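For anyone wanting to try this setup, a rough sketch of the wiring: pull the models with Ollama, then point Continue's config at them. The exact config schema varies between Continue versions, and the model tags below (e.g. `starcoder2:3b`) are assumptions - check what `ollama list` actually shows on your machine.

```
# pull the models locally (sizes/tags may differ)
ollama pull deepseek-coder-v2
ollama pull starcoder2:3b
```

Then in Continue's `config.json` (roughly - field names depend on your Continue version), something like:

```
{
  "models": [
    { "title": "DeepSeek Coder v2", "provider": "ollama", "model": "deepseek-coder-v2" }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2", "provider": "ollama", "model": "starcoder2:3b"
  }
}
```

The smaller model handles tab autocomplete (where latency matters most) and the larger one handles chat/edit requests.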
