Recent comments in /f/Futurology

Iffykindofguy t1_j9gcg0f wrote

I'm guessing you're a straight white man? That's not an insult, you just seem insulated. Brother, I don't know how to tell you this, but the status quo for 50 years was Roe v. Wade. The status quo was mail-in voting. The status quo was that you could at least say the word "gay" in class if a kid asked about it. These weren't where I wanted them to be, but you are not paying attention. Wake the fuck up.

0

dragonblade_94 t1_j9gbwsi wrote

Your argument is honestly just a half-dozen ways of rephrasing "machines cannot feel" without really positing why any of your points lead to this assumption.

>you can feel pain, a physical sensation. you can feel sad, an emotion. not interchangeable, but they are both things you feel. one is tangible, one is not.

I feel like the 'tangibility' question is vague to the point of being moot in this context. Both being stabbed and grieving over a loss are the brain processing signals caused by stimuli. I would definitely classify one as a far more complex brain activity than the other, but I don't believe there's an inherent difference other than which part of the brain does the processing. I can see a philosophical argument being made for their separation, but you would need to go a lot more in depth to explain why exactly the difference is valid.

> sensors and programming/algorithms acting as if they are nerves does not equal nerves

Why? Legitimately, why would an apparatus that does the exact same job as nerves not be equivalent other than material makeup? Given your response to the homunculus example, I have to assume you don't consider the choice of materials to be important to the question, so I'm curious as to your reasoning.

> the ai also can not catch the flu and feel like it is nauseous because it doesn’t have any parts that can be nauseous, nor does it have any living cells for the virus to infect thereby presenting the opportunity to make it nauseous

I'm not sure I see the point here. Commenting on robotics being resistant to disease doesn't really mean much in a discussion about sentience.

>AI chooses based on inputs/outputs, we do not choose to feel what we feel, that decision was made for us, not by us.

From a purely deterministic perspective, this is exactly how humans operate as well; everything we think and feel is caused by chemical reactions and stimuli bound by the laws of physics. But that doesn't prevent us from feeling emotion, it just implies that the feeling was inevitable.

>and if you could build a human cell by cell, yes, that would be a “machine"

If I'm interpreting your argument correctly, is your view that the existence of an intentioned creator is what distinguishes an 'unfeeling' being from one that feels?

1

sergius64 t1_j9gam3r wrote

Life is getting tougher everywhere, just in slightly different ways. Americans like to pretend their living situation is getting uniquely bad... but if you roll a random country generator, you'll find that living conditions in the resulting country are much worse.

Trouble with the USA is that it's very expensive, has very little government support for healthcare, and has a workaholic culture. On the plus side, earning potential here is among the highest.

My father kept moving around trying to find an ideal place to live. After a lot of work, my parents relocated to New Zealand. They lived and worked there for a while, then found out that pensions there are tiny and the country was getting expensive, so they had to move back. They have now retired to Bulgaria; it's cheap and fairly nice, but it turned out the healthcare is awful, and there are year-long waits for the clinics with good healthcare. Plus there's a lot of bureaucracy, and they practically force you to learn Bulgarian to get through it.

It's very difficult to move permanently to a lot of countries and be able to work. Many nations don't really allow it, or only allow it for certain professions, and you have to do a lot of work to get through their systems. And that's before the language barrier comes into play.

1

onyxengine t1_j9g9096 wrote

You can never guarantee that something capable of a thing will never do that thing. If you want AI to remain harmless, then you have to construct it in such a way that it can't do physical harm.

And that ship has sailed. Most militaries are testing AI for scouting and targeting, and we even have weaponized law-enforcement robots in the pipeline. San Francisco's program is the one I'm currently aware of; I'm sure there are more.

Even the linguistic models are extremely dangerous. Language is the command-line script for humans, and malicious people can program AI to convince people to do things that cause harm.

We're not at the point where we need to worry about AI taking independent action to harm humans, but on the way there, there's plenty of room for humans to cause real harm with AI.

Until we build AGI with extremely sophisticated levels of agency, every time an AI hurts a human being, it's going to be because a human wanted it to happen or overlooked ways in which what they were doing could be harmful.

1

69inthe619 t1_j9g8r93 wrote

you can feel pain, a physical sensation. you can feel sad, an emotion. not interchangeable, but they are both things you feel. one is tangible, one is not. a machine, or AI network, can not feel pain; it can only describe pain. sensors and programming/algorithms acting as if they are nerves do not equal nerves. and believe it or not, the AI also can not catch the flu and feel nauseous, because it doesn't have any parts that can be nauseous, nor does it have any living cells for the virus to infect and thereby get the opportunity to make it nauseous. can AI say it is sad about something? of course. but can it be clinically depressed? no. AI chooses based on inputs/outputs; we do not choose to feel what we feel. that decision was made for us, not by us. and if you could build a human cell by cell, yes, that would be a "machine", a biological machine, and that would make you God, which would beg the question: wtf are you doing on reddit bruh?!

1

sschepis t1_j9g4krs wrote

Yes, that's correct: sentience is a relative, assigned quality. We recognize an appearance as possessing the qualities of sentience, but this is a purely subjective experience.

This means, literally, that sentience is relative - just like time. Our perception of sentience is completely constrained by our perspective.

This means that 'sentience' is just like Schrödinger's cat: all things exist in a state that is both sentient and not sentient at the same time.

This is proof that matter exists within consciousness, not that consciousness arises from the activity of matter.

7

sschepis t1_j9g3c3y wrote

When you use the word 'sentient' do you use it in reference to yourself?

If you do not - if you only consider the word 'sentient' in relation to other people, as most people do - then you are describing a quality that you assign to others, not some inherent 'thing' that you can measure in yourself. 'Sentience' in this context is the same as 'handsome' or 'funny' - it's a completely relative, arbitrary term which is purely an effect of your perspective.

The truth is that consciousness is the ultimate indeterminate quantity, because it is indeterminacy itself: only conscious systems can make choices that run counter to the principle of conservation of energy.

Because of this - because literally everything can be perceived to be sentient - everything is conscious, since everything is potentially sentient.

1