
> framework-agnostic, drop-in chat solution

Maybe I'm being dumb, but is this a generic chat UI for OpenAI models only? Pretty bearish on adoption of this if so. As a pragmatic dev, I'd definitely not be keen to bake model lock-in into my UI for functionality as generic as chat.



The openai library works with most other providers by just changing the endpoint url, and this is Apache licensed, so I feel good about using it.


Is there a place to change the endpoint URL? It seems we just add the workflow ID and it generates a secret that the frontend uses. The Apache license is good, though.


Note that their basic example in the readme starts with `OpenAI(api_key=os.environ["OPENAI_API_KEY"])`

Generally speaking, in most typical usage you can swap in an alternative provider like OpenRouter with `OpenAI(base_url="https://openrouter.ai/api/v1", api_key="<OPENROUTER_API_KEY>")`.
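
For reference, here's a minimal sketch of that swap with the Python `openai` SDK. The model id and env var name are just examples, and I haven't verified how much of ChatKit's flow actually goes through these endpoints:

    import os
    from openai import OpenAI

    # Point the stock OpenAI client at OpenRouter instead of api.openai.com
    client = OpenAI(
        base_url="https://openrouter.ai/api/v1",
        api_key=os.environ["OPENROUTER_API_KEY"],  # assumed env var name
    )

    # Standard chat completion call, just served by a different provider
    response = client.chat.completions.create(
        model="anthropic/claude-3.5-sonnet",  # any OpenRouter model id
        messages=[{"role": "user", "content": "Hello!"}],
    )
    print(response.choices[0].message.content)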

But I haven't tested ChatKit yet, so I don't know whether it uses special endpoints that are currently only supported by OpenAI. IANAL, but I'd assume that if the client is Apache-licensed, it wouldn't be an issue for OpenRouter and other AI providers to develop their own versions of those endpoints.

Maybe I'm overly optimistic, but based on what I'm seeing with MCP and other recent developments, the industry is steadily gravitating towards a commoditized/interchangeable future where no provider has a structural moat.


Looks like it supports custom backends



