> (GitHub Copilot allows selecting different models, but I didn't check more carefully whether that also includes a local one; does anyone know?)
To my knowledge, it doesn't.
On Emacs there's gptel, which integrates different LLMs quite nicely inside Emacs, including a local Ollama.
> gptel is a simple Large Language Model chat client for Emacs, with support for multiple models and backends. It works in the spirit of Emacs, available at any time and uniformly in any buffer.
https://github.com/karthink/gptel
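
For anyone curious, the gptel README describes registering a local Ollama backend; a minimal sketch for your init file looks roughly like this (the host is Ollama's default port, and the mistral model is just an example; older gptel versions take model names as strings rather than symbols):

    ;; Minimal sketch: make a local Ollama instance the default gptel backend.
    ;; Assumes Ollama is running on its default port (11434) and that a model
    ;; such as mistral has already been pulled with `ollama pull mistral'.
    (setq gptel-model 'mistral:latest
          gptel-backend (gptel-make-ollama "Ollama"
                          :host "localhost:11434"
                          :stream t
                          :models '(mistral:latest)))

After that, M-x gptel opens a dedicated chat buffer, and gptel-send sends the region or buffer text from wherever you happen to be.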