Hacker News

> You need at least 10 trillion tokens to make an 8 billion parameter model

Are you sure it is not just very inefficient?



I'm not an AI expert, but it seems to me that the general consensus is that current LLMs are quite inefficient and that there's room for improvement.
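The back-of-envelope arithmetic behind this question can be sketched in Python. This is a rough illustration, assuming the Chinchilla rule of thumb of roughly 20 training tokens per parameter as the compute-optimal baseline; the 10 trillion tokens and 8 billion parameters come from the quoted claim above:

```python
# Compare the claimed token count against the Chinchilla compute-optimal
# heuristic of ~20 training tokens per parameter (Hoffmann et al., 2022).
params = 8e9            # 8 billion parameters (from the quoted claim)
tokens_claimed = 10e12  # 10 trillion tokens (from the quoted claim)

chinchilla_tokens = 20 * params          # compute-optimal estimate: 160 billion tokens
ratio = tokens_claimed / chinchilla_tokens

print(f"Chinchilla-optimal tokens: {chinchilla_tokens:.1e}")
print(f"Claimed / optimal ratio:   {ratio:.1f}x")  # roughly 60x the compute-optimal amount
```

So 10T tokens is far beyond what the Chinchilla analysis calls compute-optimal for an 8B model. Note, though, that training well past that point can be a deliberate trade-off (spending extra training compute to get a smaller model that is cheaper to serve) rather than pure inefficiency.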


