I think it's a legitimate question, because ultimately all brain activity is electrical and chemical signals. To say that some electrical signal objectively is or is not an emotion implies that there is some objective rule for deciding this -- but I'm not aware of any such rule, only longstanding conventions.
AI isn't programmed to have emotions, merely to replicate a semblance of them. Whatever you make of the electrical signals, the models are just tab-completion, ad infinitum.
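To put that "tab-completion, ad infinitum" point in concrete terms, here is a rough sketch of the loop these models run. The next_token function below is a toy stand-in I'm making up for illustration, not any real model or library:

    import random

    # Toy stand-in for an LLM's sampling step: a real model scores every
    # possible next token against the entire context; this one just picks
    # a word at random. The point is the shape of the loop, not the output.
    VOCAB = ["that", "is", "noted", "fine", ".", "<eos>"]

    def next_token(context: str) -> str:
        return random.choice(VOCAB)

    def generate(context: str, max_tokens: int = 20) -> str:
        for _ in range(max_tokens):
            token = next_token(context)  # conditioned only on the text currently in context
            if token == "<eos>":
                break
            context += " " + token       # append the completion and go again, ad infinitum
        return context

    # Whatever upsetting thing you typed exists for the model only for as long
    # as it sits inside this string; drop the string and nothing carries over.
    print(generate("You said something upsetting."))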
No, it completely misses the point. If you say something very upsetting to me, it will genuinely affect me and my day and have negative consequences for me and the people around me, because I will have an emotional reaction. You can't upset an AI because it doesn't have the capacity to be upset; it can only return some words and then carry on as if nothing happened.
I hope that makes sense. The underlying mechanism of my emotions doesn't matter at all, only the impact.
AIs are affected by things you say to them while those things are in their context window. Your context window for bad news is about a day.
Why are you certain that you -- a physical system amounting to a set of complex electrical and chemical signals -- have this capacity for "genuine emotion", while another physical system that outwardly behaves much the same as you does not?
If I made a replica of your body accurate down to the atomic scale, and told that body something upsetting, it would outwardly behave as though it were experiencing an emotion. Are you claiming that it would not in fact be experiencing an emotion?
Wait, are you claiming astrology is real and that the moon landings were faked?
No, of course you're not, and I'm not claiming anything about a full human replica, so please don't put words in my mouth like that.
We're not talking about a replica of a body. We're talking about LLMs. They don't have bodies and can't be moved, and being moved is what emotion means.
And I'm not sure what you mean when you say my context window is a day. That's a strange thing to say. I'm still deeply affected by childhood traumas several decades after they happened; they affect the tension patterns in my body every day. An LLM isn't affected even within its context window, regardless of its length. It's only affected in the sense that a microwave is affected by setting it to defrost, or a calculator is affected by setting it up to use RPN.
If you built a perfect replica of a human, then it would feel emotions just as a human does. But that's not what we're talking about, is it? There's a saying in my country: if your granny had balls, she'd be your granda. We can argue all day about "what if x" and "what if y", but we might be better served focusing on the reality we actually have.