
Why would you trust an LLM to make the right decisions on your behalf? You think it can't be tricked by dark patterns? LLMs can't even follow direct orders, much less detect deception or ill intent.

I guess you did say "the idea of" not "the reality of".



