
> it hallucinates less,

does it? An anecdote from yesterday: my wife was asked to do a lit review as an assignment for nursing school. Her professor sent her an example list of papers on the topic, with a brief "relevance" summary for each. My wife asked me for help, as she was frustrated that she couldn't find any of the referenced papers online (she's not the most adept at technology and figured she was doing something wrong). I took one look at the email from her professor and could tell just from the formatting that it was LLM-generated (which model I don't know, but obviously a 2025-era one). The professor didn't say anything about using an LLM, and my wife didn't suspect that might be the case.

My wife and I did some Google Scholar searches, and _every_ _single_ _one_ of the 5 papers cited did _not_ exist. In 2 of the cases, similar papers did exist, but with different authors or a different title that merely resembled the fake "citation". The other three did not exist in any form: there were papers on the same subject, sure, but nothing closely resembling the "citations" in either authorship or title.
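
For anyone who wants to sanity-check a reference list like this at scale, here's a minimal sketch. Assumptions: Python, and Crossref's public REST API standing in for Google Scholar (which has no official API); the query title below is a made-up placeholder. The idea is just to pull the closest real matches for each suspect citation so a human can eyeball them.

    import json
    import urllib.parse
    import urllib.request

    def closest_real_titles(citation_title: str, rows: int = 5) -> list[str]:
        # Ask Crossref for the works whose bibliographic metadata best
        # matches the suspect citation, and return their actual titles.
        url = ("https://api.crossref.org/works?"
               + urllib.parse.urlencode(
                   {"query.bibliographic": citation_title, "rows": rows}))
        with urllib.request.urlopen(url, timeout=10) as resp:
            data = json.load(resp)
        return [item["title"][0]
                for item in data["message"]["items"]
                if item.get("title")]  # a few Crossref records lack titles

    # Placeholder title; substitute each entry from the suspect list.
    for title in closest_real_titles("Effect of nurse staffing on patient outcomes"):
        print(title)

If none of the top matches is even close to the cited title and authors, that's a strong hint the citation was hallucinated, which is exactly the pattern we found by hand.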


