Yep - and the hardest part of gathering the input signs-and-symptoms data is that patients often don't know the words to describe what they're feeling.

LLMs are not going to help if the input data is garbage, and hallucinations only make it worse.

“Are you dizzy?” - sometimes, I’m not sure… etc
