
I do not understand the hostility that comes out whenever an LLM is used for a non-critical point in an argument.

1) LLMs have the potential to, on the whole, raise the baseline "correctness" of people's opinions. (Ex: asking Gemini Flash why seed oils are unhealthy begins with "that's a highly debated and complex topic," which is infinitely better than believing some short-form shopping channel.)

2) They soften the emotional impact when a claim is inevitably corrected, making for a less toxic environment and increasing the odds that a topic gets discussed and an opinion possibly revised.

3) More often than not, I've found them not only to be correct, but to offer nuance in the answer. (Ex: mitochondria aren't the same size across everything that has them.)

We should be pointing out flawed usage, not mounting a wholesale assault on all usage.
