Hacker News

You don’t need general programmability for AI inference.


The money's in the training, not the inference.

If you look at Apple and Google, they already have their own hardware for inference in their smartphones. They don't need NVidia for that.



Hmmm, that's worse for NVidia.


NVIDIA owns the interconnects used for this training. I'm sure they have a competing AI accelerator of their own in the works too.


You don’t need programmability for AI training either.



