
You can still run your own models locally, while cloud AI providers will be heavily consolidated into a few.

While I currently use Claude for programming tasks, local models can be tuned to deliver about 80% of the time savings you get from using AI. It depends a bit on the work you do. Local models will probably keep improving, while frontier models seem to be hitting hard ceilings.

Where I would disagree is the claim that joining concepts or knowledge works at all with current AI. In my opinion it works fairly badly. Even the logical and mathematical improvements in the latest Gemini model don't impress me much yet.



Local models are fine for the way we have been using AI, as a chatbot or a fancy autocomplete. But everyone is cramming AI into everything. Windows will be an agentic OS whether we like it or not, and there will be no using your own local model for that use case. Everything looks like it is moving that way.


Hmmm, maybe use a different OS? I would never dream of using Windows to get any type of work done myself, and there are many others like me. There certainly are choices. If you prefer to stay, MCP services can be configured to use local models, and people are doing so on Windows as well (and definitely on macOS and Linux). From an OS instrumentation perspective, I think macOS is probably the most mature: Apple has acknowledged MCP and intends a hybrid approach defaulting to its own in-house, on-device models, but by embracing MCP it appears to be allowing local model access.
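For what it's worth, the "point a client at a local model" part is straightforward today: many local runners expose an OpenAI-compatible HTTP endpoint, so a tool or MCP service can target localhost instead of a cloud provider. A minimal sketch; the host, port, and model name below are assumptions (Ollama's defaults), not fixed facts:

```python
import json
import urllib.request

# Assumed local OpenAI-compatible endpoint (e.g. Ollama's default).
BASE_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload for a local model."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str, url: str = BASE_URL) -> str:
    """POST the payload to the local server and return the reply text."""
    body = json.dumps(build_chat_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return data["choices"][0]["message"]["content"]

payload = build_chat_request("llama3", "Summarize this function.")
```

Swapping the base URL back to a cloud provider is the only change needed, which is exactly why the OpenAI-compatible API shape became the de facto local standard.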



