
Given the topic I'm unfortunately not comfortable sharing the details in a public space like this, but the answer that it gave was not just a misinterpretation of the question, it was actually entirely wrong on the merits of its own interpretation.

And even if it were a misinterpretation, the result is still largely the same: if you don't know how to ask good questions, you won't get good answers, which makes it dangerous to rely on these tools for things you're not already an expert in. This is in contrast to all the people who claim to be using them to learn about important concepts (including lots of people who claim to be using them as financial advisors!).



If you don't know how to ask a human doctor a good question you can't expect to get a good answer either.

The difference is that a human doctor probably has a lot of context about you and the situation you're in, so they can usually guess what the intention behind your question is and adjust their answer accordingly. When you talk to an LLM, it has none of that context. So the comparison isn't really fair.

Has your mom ever asked you a computer question? Half the time the question makes no sense, and explaining to her why would take hours, and even then she still wouldn't get it. So the best you can do is guess what she wants based on the context you have.



