> "It is basically like having an incredibly smart engineer/scientist/philosopher/etc. that can explain things quite well, but for pretty much every field. Does this 'person' make mistakes?"
Does this person stubbornly insist on being right, even if they are wrong?
Likely - that's just how humans usually behave, and ChatGPT does the same.
At least, that was my thinking when people complained about ChatGPT being a know-it-all bluffer.
> Does this person stubbornly insist on being right, even if they are wrong?
Actually, ChatGPT will almost always change its tune if you tell it that it is wrong. That's because it is a stochastic parrot, and standing firm after being contradicted is an "unlikely" continuation - even if the original answer was actually right. But we'll see if this gets updated/broken too.
Only if you correct a wrong answer. If you "correct" a right answer ("I don't think that's right"), it gives an even worse answer. So be careful when correcting.