
The reason I think you're an LLM is that you somehow believe that knowing the distance between a place and itself is meaningful. That's still the case.

> The model needs to reply with something and here it tells the user the assumptions it made.

No, it didn't "tell" its assumption (where did it output "I assume you mean you want to know the distance from a US city to itself"?). It lacks the awareness to even know an assumption was made, let alone that it's nonsensical. But it's clear from its responses what the assumption was: that you want to know the distance between a city and itself.

In your second prompt you provided the context (a list of exclusively US cities). With context established, as I said in the first place, it's no longer pure garbage in.

Honestly, you don't need to copy-paste more of that stuff; it does nothing to counter my argument that if you give it garbage, it will respond with garbage.



> No, it didn't "tell" its assumption (where did it output "I assume you mean you want to know the distance from US city to itself"?)

Just reread what it told me: it said the city is zero miles away from itself, that the query was nonsensical, and that one should be more specific if the cities being compared are different.

> But it's clear from its responses what the assumption was: that you want to know the distance between a city and itself.

If you ask for the distance between just Boston and London, it assumes Boston, MA and London, UK just like Google does. Neither asks if I want the distance between London, UK and Boston, England. Same thing for Portland and Boston, even though Portland, Maine is much closer to Boston, MA. I think both ChatGPT and Google just assume the largest cities.

Not sure if you got my point about the list. In that situation I'd find the zero miles useful, because otherwise I'd have to remember to exclude some entries from the list and then add those records back with the trivial transformation afterward.

Another way to think of it: if I wrote a function for you that checks whether a number n is divisible by 17, wouldn't you want it to handle the trivial case of "Is 17 divisible by 17?" when run over a CSV, rather than excluding the trivial examples and handling them manually? Trivial-case handling is important for LLMs. It's also useful if you're trying to test their logical consistency.
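To illustrate the analogy, here's a minimal Python sketch (the function and sample values are hypothetical, not from the thread): a correct divisibility check handles the trivial input 17 through the same code path as every other value, so nothing needs to be filtered out of the data first.

```python
def divisible_by_17(n: int) -> bool:
    """Return True if n is divisible by 17.

    The trivial case n == 17 needs no special handling:
    17 % 17 == 0, so it falls out of the general rule.
    """
    return n % 17 == 0

# Applying it over a list of values (e.g. a CSV column): the trivial
# entry 17 is answered correctly without being excluded up front.
values = [17, 34, 40, 0, 170]
results = [divisible_by_17(n) for n in values]
print(results)  # [True, True, False, True, True]
```

The same design argument applies to the distance question: returning zero for a city paired with itself keeps the general rule uniform, rather than forcing the caller to strip out identity pairs and re-insert them later.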

> The reason I think you're an LLM is that you somehow believe that knowing the distance between a place and itself is meaningful. That's still the case.

Based on my posting history (13 years) and username (my real name), chances are pretty good I'm human. I mentioned my reasons; sorry I couldn't communicate them clearly enough. Very likely I will disengage from this thread now. Also, I don't appreciate being called a bot; I think people doing that makes Hacker News worse.


> Just reread what it told me: it said the city is zero miles away from itself, that the query was nonsensical, and that one should be more specific if the cities being compared are different.

It made the least sane assumption. If it were aware of that, it would have made a more reasonable one (e.g., picked two different cities).




