
> and has the potential to debase shared reality.

If only.

What it actually has is the potential to debase the value of "AI." People will just eventually figure out that these tools are garbage and stop relying on them.

I consider that a positive outcome.



Every other source of information, including (or maybe especially) human experts, can also make mistakes or hallucinate.

The reason people go to LLMs for medical advice is that real doctors actually fuck up each and every day.

For clear, objective examples, look up stories where surgeons leave things inside patients' bodies post-op.

Here’s one, and there are many like it.

https://abc13.com/amp/post/hospital-fined-after-surgeon-leav...


"A few extreme examples of bad fuck ups justify totally disregarding the medical profession."


Please don't use quotation marks to make it look like you're quoting someone when you aren't. That's an internet snark trope, and we're trying to avoid that kind of thing here.

You're welcome to make your substantive points thoughtfully, of course.

https://news.ycombinator.com/newsguidelines.html


Yup, make up something I didn't say to take my argument to a logical extreme so you can feel smug.

"totally disregard"

Yeah, right, that's what I said.


"Doing your own research" is back on the menu boys!


I'll insist the surgeon follows ChatGPT's plan for my operation next time I'm in theatre.

By the end of the year, AI will actually be doing the surgery, given the recent advancements in robotic hands, right, bros?


People used to tell me the same about Wikipedia.


That it could "debase shared reality?"

Or that using it as a single source of truth was fraught with difficulties?

Has the latter condition actually changed?


That it's a garbage data source that could not be relied upon.



