
> It is basically like having an incredibly smart engineer/scientists/philosopher/etc that can explain things quite well, but for pretty much every field. Does this "person" make mistakes? Can't cite their sources? Yeah this definitely happens (especially the sources thing), but when you're trying to understand something new and complex and you can't get the "gist" of it, ChatGPT does surprisingly well.

if you don't "understand something new", how do you know whether to trust what it says? the usefulness has to be based on something, not feelings.



That goes for pretty much any resource; there's a lot of botched or sub-optimal explanations of complex things out on the internet. But being able to have a conversation with someone who has a decent conceptual grasp on something is better than a perfect encyclopedic article alone, and ChatGPT is able to roughly do that for a lot of subjects.

As for how you know it's accurate, you don't really have to know it's accurate most of the time, because if you're using it to try to learn something new it's probably going to become apparent quickly enough. Nobody is reading something and then going "Welp, now I'm done learning."

As a software example, you can ask it about implementing the Fourier transform. If it says something wrong, you'll find out.
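To make that concrete: here's a minimal naive DFT sketch in Python (my own illustration of the kind of code you'd be testing, not something an AI produced). If an explanation of the transform were wrong, checking a known case like a constant signal would expose it immediately:

```python
import cmath

def dft(xs):
    """Naive O(n^2) discrete Fourier transform:
    X[k] = sum over m of x[m] * exp(-2*pi*i*k*m/N)."""
    n = len(xs)
    return [
        sum(x * cmath.exp(-2j * cmath.pi * k * m / n)
            for m, x in enumerate(xs))
        for k in range(n)
    ]

# A constant signal should put all its energy in the DC bin (k = 0),
# with every other bin (numerically) zero.
result = dft([1, 1, 1, 1])
```

A claim like "the DC bin equals the sum of the samples" either holds when you run this or it doesn't, which is exactly the kind of fast feedback loop that catches mistakes.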

The question then becomes how often it is inaccurate and how badly. But in that regard, as long as you don't treat it as an all-knowing oracle, it's at least as useful as having a friend that knows a lot of stuff you want to learn.


> That goes for pretty much any resource; there's a lot of botched or sub-optimal explanations of complex things out on the internet. But being able to have a conversation with someone who has a decent conceptual grasp on something is better than a perfect encyclopedic article alone, and ChatGPT is able to roughly do that for a lot of subjects.

people keep saying this, but it isn't true. if you ignore sites like this one and reddit there are plenty of authoritative articles and explanations about things with provenance. and if they are wrong they'll update their information.


Clearly you are of the opinion that ChatGPT is useless, there are superior resources already available for any topic, and it's all just hype.

Well, then we don't need to argue this since the problem will elegantly solve itself if that's true.

I disagree though. For a lot of things it feels like I can get much better answers than Google, especially when it comes to somewhat conceptual questions.

(Also, I don't use news aggregator comments to learn things unless they're the only possible source. But if you think every blog post or YouTube video that got things wrong has a detailed erratum, you'd be sorely mistaken. It's so uncommon on YouTube that the practice is usually commended when noticed.)


> Clearly you are of the opinion that ChatGPT is useless, there are superior resources already available for any topic, and it's all just hype. Well, then we don't need to argue this since the problem will elegantly solve itself if that's true.


I never said ChatGPT is useless, lol. it's truly amazing how people can be so bad at reading comprehension yet praise AI bots in the same post.


Ironically, the quote you just copied and pasted shows that I never claimed you explicitly said that. So much for reading comprehension :P

But in all seriousness, I simply didn't have much to go on, and extrapolated based on context in the thread. You could always elaborate on what your opinions actually are.


By having a conversation with it. Ask probing questions. Verify later with wikipedia.

For instance, I asked it about spacetime metrics, and I learned about the varying types and the circumstances in which each is used. Ask it about their mathematical properties and it gives a succinct explanation contrasting the different approaches. Surely you can see the value of conversational-style pedagogy.

Think of it like learning about things from mates at the bar. In the thick of it there may be bits of inaccuracy, but you'll certainly know more about a topic than before. Then you can verify bits of things later in papers or books, etc.


seems like a waste of time if you're going to have to verify with Wikipedia anyway. personally I'd prefer to read wikipedia first.



