Recent comments in /f/Futurology

Desperate_Food7354 t1_j9iq35x wrote

Yes. Are you programming a human? Is that what you want? Why do you even get up every day? It's completely illogical beyond the perspective of doing tasks that give you feelings of "goodness," which typically serve the purpose of achieving reproduction.

1

Desperate_Food7354 t1_j9iprju wrote

Emotions are the precondition for intelligence in the biological world. Without them you would die and fail to reproduce, and you would have no reason to do anything at all, since there is no reason to do anything in the first place beyond the feelings that drive your behavior, programming passed on to you because it ensures genetic transfer. An AI would likely have emotions to the extent that it needs to achieve correct answers, with feedback along the lines of: this answer is wrong = negative stimulus, this answer is correct = positive stimulus. But no, it will not need all of our emotions. If you are instead asking whether you can code an AI to be exactly like us, practically a human with the full range of emotions, I see no reason why not.

1
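The "wrong answer = negative stimulus, correct answer = positive stimulus" feedback loop described above resembles a scalar reward signal in reinforcement learning. A minimal sketch of that idea, assuming a toy bandit-style learner that shifts its action preferences from +1/−1 feedback (all names here are hypothetical, not from any real library):

```python
import random

class BanditLearner:
    """Toy learner: positive/negative feedback shifts action preferences."""

    def __init__(self, actions, lr=0.1):
        self.values = {a: 0.0 for a in actions}  # estimated value per action
        self.lr = lr

    def choose(self):
        # Greedy choice among current best-valued actions, random tie-break.
        best = max(self.values.values())
        return random.choice([a for a, v in self.values.items() if v == best])

    def feedback(self, action, reward):
        # reward = +1 ("positive stimulus") or -1 ("negative stimulus");
        # nudge the action's estimated value toward the received reward.
        self.values[action] += self.lr * (reward - self.values[action])

learner = BanditLearner(["answer_a", "answer_b"])
for _ in range(50):
    a = learner.choose()
    r = 1.0 if a == "answer_b" else -1.0  # pretend answer_b is "correct"
    learner.feedback(a, r)

print(learner.values["answer_b"] > learner.values["answer_a"])  # True
```

This is of course far simpler than how large models are actually trained, but it illustrates the bare mechanism: a single scalar signal is enough to steer behavior toward "correct" without anything resembling a full range of emotions.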

Desperate_Food7354 t1_j9ipcct wrote

How are emotions not logical? If you didn't have sex, the genes that allow things like you to exist wouldn't exist; that's completely logical. How is anger not logical? If you experienced no anger you wouldn't defend yourself, resulting in zero sex and zero gene transfer.

−6

jawshoeaw t1_j9ip89n wrote

Maybe. I have been impressed with ChatGPT, but mostly in its ability to replicate the tedious and practical, the things so many of us must do for a paycheck. You know that feeling when you love a song and wonder, will there ever be another song this good? Or a book where you're literally depressed that it's over and want to cry that nothing written will ever make you feel that way again? I don't believe that will be reproduced by an AI. If it is, I'm done.

8

CaseyTS t1_j9io8yn wrote

The thing you were talking about in your comment was developing deep and unique insights about the human experience. Yes, you can do that with a generative model that has no subjective experience: it can intelligently and creatively synthesize information from vast amounts of documented human experience. That is literally what generative LLMs are designed to do, learn from humans and talk about it.

0

SandAndAlum t1_j9io0d3 wrote

There is the somewhat open question of whether there are physical phenomena that cannot be modelled as an information process. True randomness would be one. Free will (insofar as the phrase is well defined at all) would potentially be another.

If so, then not all physical phenomena are reducible to information processes, and "meaning" could be one of them.

3

EvilKatta t1_j9inlfj wrote

Predictably, you can't answer this question without defining emotions or at least the lack of emotions.

Let me try: emotions are an extrarational drive that informs the thinking process. This drive is consistent (i.e. it follows some kind of logic) but doesn't come from the thought process itself. It co-pilots decision making: for example, it "punishes" the rational mind for "wrong" decisions, "rewards" it for good and timely outcomes, etc.

Right now, AIs basically have their training and user prompts for that. In the future, self-guided AIs will have their training frameworks in place, like a set of moral values. So I think yes, one way you can describe it is "having emotions".

1
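The "co-pilot" framing in the comment above can be sketched as reward shaping: an extrarational signal added on top of the rational task score before a decision is ranked. A minimal illustration (the function, its parameters, and the weights are all hypothetical, chosen only to show the shape of the idea):

```python
def shaped_score(task_score, timely, violates_values):
    """Combine a rational task score with 'emotional' bonuses/penalties.

    task_score      -- the rational mind's own evaluation of a decision
    timely          -- whether the outcome arrived in good time
    violates_values -- whether the decision breaks the value framework
    """
    emotion = 0.0
    if timely:
        emotion += 0.5   # "reward" for a good and timely outcome
    if violates_values:
        emotion -= 1.0   # "punish" a decision against the moral framework
    return task_score + emotion

# A timely, value-respecting decision outranks an equally "rational" one
# that violates the framework.
print(shaped_score(1.0, timely=True, violates_values=False))   # 1.5
print(shaped_score(1.0, timely=False, violates_values=True))   # 0.0
```

In this picture, the training framework or moral-value set plays the role of the emotion: it never computes the answer itself, it just biases which answers the system prefers, which is one reasonable sense in which a self-guided AI could be said to "have emotions."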