
Can it be run fully locally?


Yes! You can run the self-hosted version locally, following the quickstart here: https://docs.vocode.dev/python-quickstart#self-hosted


I think this used to mean "can it be run offline," and right now, whenever an LLM is involved, the answer is usually a sound no.


Ah! Right now our default is set to use OpenAI, but you can actually use local LLMs by creating a custom agent. We're going to add a fully local STT/TTS/LLM stack... we just haven't had time for it yet!
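To make the "custom agent" idea concrete, here's a minimal sketch of what swapping in a local model could look like. Note this is illustrative only: `LocalLLMAgent`, `respond`, and the pluggable `generate` callable are hypothetical names, not Vocode's actual API; a real version would wrap whatever offline model you run (e.g. a llama.cpp server).

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class LocalLLMAgent:
    """Hypothetical custom agent that delegates to any locally-hosted LLM.

    `generate` is a pluggable callable (e.g. a wrapper around a local
    model server), so no cloud API is needed.
    """
    generate: Callable[[str], str]
    system_prompt: str = "You are a helpful voice assistant."

    def respond(self, transcript: str) -> str:
        # Build the prompt from the system prompt and the transcribed
        # speech, then let the local model produce the reply.
        prompt = f"{self.system_prompt}\nUser: {transcript}\nAssistant:"
        return self.generate(prompt).strip()

# Example with a stub "model" standing in for a real local LLM:
def echo_model(prompt: str) -> str:
    user_turn = prompt.split("User: ")[1].split("\n")[0]
    return " You said: " + user_turn

agent = LocalLLMAgent(generate=echo_model)
print(agent.respond("Can it run offline?"))  # → You said: Can it run offline?
```

The point of the `generate` callable is that the agent doesn't care whether the completion comes from OpenAI or from a process on your own machine.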

If anyone wants to help with it, we're totally open to contributions :)



