
Your emotions for the AI are real, but the AI's emotions for you aren't.


Why is the electricity in your brain real, and fake in the AI?


The electricity in both is real, and it's unkind to twist the words of the person you responded to that way. They specifically mentioned emotions, not electricity. An AI will be completely unaffected by anything said to it.


I think it's a legitimate question, because ultimately all brain activity is electrical and chemical signals. To say that some electrical signal objectively is or is not an emotion implies that there is some objective rule for deciding this -- but I'm not aware of any such rule, only longstanding conventions.


AI isn't programmed to have emotions, merely to replicate a simulacrum of those sensations. However you regard the electrical signals, the models are just tab-completion, ad infinitum.
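To be concrete about what "tab-completion, ad infinitum" means mechanically: the model predicts one token, appends it to its own input, and repeats. A minimal sketch, where sample_next_token is a made-up stand-in for the real model, not any actual API:

    import random

    def sample_next_token(tokens):
        # Stand-in for the model: in reality this scores every
        # candidate continuation of `tokens` and samples one.
        # There is no state beyond the token list it is handed.
        return random.choice(["the", "cat", "sat", "down", "."])

    def generate(prompt_tokens, max_new_tokens=10):
        tokens = list(prompt_tokens)
        for _ in range(max_new_tokens):
            tokens.append(sample_next_token(tokens))  # complete...
        return tokens                                 # ...and repeat

    print(generate(["you", "said", "something", "upsetting"]))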


Your emotions are just a tab-completion to God/Creator/or whatever.


No, that completely misses the point. If you say something very upsetting to me it will genuinely affect me and my day and have negative consequences for myself and the people around me, because I will have an emotional reaction. You can't upset an AI because it doesn't have the capacity to be upset; it can only return some words and then continue on as if nothing happened.

I hope that makes sense. The underlying functionality of my emotions doesn't matter at all, only the impact.


> it doesn't have the capacity to be upset

AIs are affected by things you say to them while those things are in their context window. Your context window for bad news is about a day.
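In the mechanical sense that's true: the model's only "memory" is whatever text gets resent with each request. A toy sketch (dummy_model and chat are made-up names, not any real API):

    def dummy_model(prompt):
        # Stand-in for an LLM: a pure function of its input text.
        return "ack: " + prompt.splitlines()[-1]

    history = []

    def chat(user_message, window=4):
        history.append(user_message)
        prompt = "\n".join(history[-window:])  # older turns fall away
        reply = dummy_model(prompt)
        history.append(reply)
        return reply

    chat("something upsetting")
    for _ in range(3):
        chat("small talk")
    # Once the upsetting message slides out of the window it stops
    # influencing replies -- the claim above is that your day-old
    # bad news fades in much the same way.
    print(chat("how do you feel now?"))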

Why are you certain that you -- a physical system amounting to a set of complex electric and chemical signals -- have this capacity for "genuine emotion", while another physical system, that outwardly behaves much the same as you, does not?

If I made a replica of your body accurate down to the atomic scale, and told that body something upsetting, it would outwardly behave as though it were experiencing an emotion. Are you claiming that it would not in fact be experiencing an emotion?


Wait, are you claiming astrology is real and that the moon landings were faked?

No of course you're not, and I'm not claiming anything about a full human replica so please don't put words in my mouth that way.

We're not talking about a replica of a body. We're talking about LLMs. They don't have bodies and can't be moved, and being moved is the very definition of emotion.

And I'm not sure what you mean by saying my context window is a day. That's a strange thing to say. I'm deeply affected by childhood traumas several decades after they happened. They affect the tension patterns in my body every day. An LLM isn't affected even within its context window, regardless of its length. It's only affected in the sense that a microwave is affected by setting it to defrost, or a calculator is affected by setting it to use RPN.

If you built a perfect replica of a human then it would feel emotions just as a human does. But that's not what we're talking about, is it? There's a saying in my country: if your granny had balls she would be your granda. We can argue all day about "what if x" and "what if y", but we might be better served focusing on the reality we actually have.


TFA isn't talking about the AIs of now but of the future. Broaden your imagination to where we'll be in 10 or 15 years, not 30 days.


You continue as if nothing happened even though <insert really bad event from 10 years ago>.

By the way, AI will react and have a larger context window soon(ish). Then what?



