I am working on an app to make it even easier to run local LLMs, with support for multiple chats, RAG, and STT. I built it mostly to learn about the different tasks that are possible with local LLMs, and specifically for my wife, who was getting overwhelmed with those things (and for some reason found setting up Ollama overwhelming). The tech stack is Electron + NuxtJS. It's currently Mac-only, but I have already started tinkering with Windows support.
https://msty.app