VSCode plugins for local code generation: Continue or Twinny?
Options:
🤖 Continue: everything goes through chat only, no code autocompletion
🦙 Llama Coder: code autocompletion, no chat interface
🔍 Cody from Sourcegraph: pricing model is unclear
👯♂️ Twinny: a new project that 🤝 combines the features of Llama Coder and Continue: both chat and code autocompletion.
_For autocompletion, a smaller "base" model (1-3B) and a reasonably powerful machine are needed so that it does not lag. For chat to work, you also need to run an "instruct" model._
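As a sketch of that two-model setup, assuming Ollama as the local backend (the specific model tags below are examples from the Ollama library, not a recommendation from this post):

```shell
# Small "base" model for fast inline autocompletion (fill-in-the-middle)
ollama pull deepseek-coder:1.3b-base

# Larger "instruct" model for the chat interface
ollama pull deepseek-coder:6.7b-instruct

# The plugin (e.g. Twinny) is then pointed at the local Ollama server,
# typically http://localhost:11434, with one model configured for
# autocompletion and the other for chat.
```

Running both models side by side is why a more powerful machine helps: the base model must respond within a fraction of a second on every keystroke, while the instruct model can afford slower, longer generations in chat.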