
From the description it seems even the larger 120b model can run decently on a 64GB+ (Arm) MacBook? Has anyone tried already?

> Best with ≥60GB VRAM or unified memory

https://cookbook.openai.com/articles/gpt-oss/run-locally-oll...



A 64GB MacBook would be a tight fit, if it works.

There's a limit to how much RAM macOS will assign to the GPU, and you'd be constrained by whatever else you need to keep running while doing inference.
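A rough sketch of why 64GB is tight: by default macOS caps how much unified memory can be wired for the GPU. The ~75% fraction below is a commonly reported heuristic, not a documented guarantee, and the exact cap varies by macOS version (on recent releases it can reportedly be raised with `sudo sysctl iogpu.wired_limit_mb=<MB>`).

```python
# Estimate the default GPU-wired memory cap on Apple Silicon.
# The 0.75 fraction is an assumption based on commonly reported behavior,
# not a documented API value.

def default_gpu_limit_gb(total_ram_gb: float, fraction: float = 0.75) -> float:
    """Approximate unified memory the GPU can use by default."""
    return total_ram_gb * fraction

for ram in (64, 96, 128):
    print(f"{ram} GB RAM -> ~{default_gpu_limit_gb(ram):.0f} GB usable by the GPU")
```

On these assumptions, a 64GB machine gives the GPU only ~48GB by default, which is below the suggested 60GB, while 96GB clears it comfortably.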

Maybe there will be lower quants that use less memory, but you'd be much better served with 96GB or more.
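To put numbers on the quant question, here's a back-of-envelope sketch of weight memory for a 120B-parameter model at a few precisions. The 4.25 bits/weight figure is an assumption approximating 4-bit formats with per-block scales (roughly MXFP4-like); it ignores the KV cache and runtime overhead, so real requirements are higher.

```python
# Back-of-envelope weight memory for a 120B-parameter model.
# bits_per_weight of 4.25 approximates a 4-bit block-scaled format (assumption);
# KV cache and framework overhead are deliberately ignored.

def weights_gb(params_billions: float, bits_per_weight: float) -> float:
    """Weight storage in decimal gigabytes."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for name, bits in [("fp16", 16), ("q8", 8), ("~4-bit", 4.25)]:
    print(f"{name:>7}: ~{weights_gb(120, bits):.0f} GB")
```

At ~4 bits the weights alone land around 64GB, which is why the ">=60GB" guidance leaves so little headroom on a 64GB machine.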



