Recent comments in /f/Futurology

SupremeEmperorNoms t1_j9fgfja wrote

At the rate it's going, we're looking at ending up in the same situation as The Outer Worlds. It's not really going to be a dictatorship; it's going to be a corporate wonderland. The upper middle class and everyone below will essentially be born into debt and be required to work it off as it passes down from generation to generation, while the highest class grows its generational wealth ad infinitum. That lasts until the day a serious economic collapse upends the entire system, resulting either in mass loss of life as the top class takes desperate measures to maintain its way of life, or in a war fought between corporate entities just to lower the number of mouths to feed, after which the remaining population is worked at a higher pace to ensure no loss in productivity.

0

Iffykindofguy t1_j9ffv86 wrote

I agree, Dems and Republicans prior to the mid-2010s were the same party. But the Democrats are being taken over at the local level, and at least pretend to be at the federal level. Republicans will turn us into a Christian theocracy; Democrats want universal healthcare, maybe. You're the one trying to get internet points here, boy. Stay in your lane or start bringing facts.

−4

dragonblade_94 t1_j9ffhu2 wrote

Are we talking physical pain, like getting stabbed, or emotional grief?

If the former, there's no reason to think a machine cannot be designed to detect pain. In organics, it just boils down to nerve endings sending a signal that translates to "whatever I am currently experiencing is bad." We already have capacitive tech, it wouldn't be all that exceptional to throw it on the exo-layer of a robot and have it move away from anything that applies enough pressure/heat/charge/etc.
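The reflex described above can be sketched in a few lines. This is a hypothetical illustration only; the sensor names, thresholds, and function names are all made up, not taken from any real robotics API:

```python
# Toy "nociception" loop: a sensor reading above its limit is flagged as
# "bad" and triggers a withdrawal reflex, analogous to nerve endings
# signalling "whatever I am currently experiencing is bad."

def classify_stimulus(pressure: float, heat: float, charge: float,
                      limits=(50.0, 60.0, 5.0)) -> bool:
    """Return True ("pain") if any reading exceeds its illustrative limit."""
    return pressure > limits[0] or heat > limits[1] or charge > limits[2]

def react(pressure: float, heat: float, charge: float) -> str:
    # The reflex itself: withdraw from whatever exceeded a limit,
    # otherwise carry on with the current task.
    return "withdraw" if classify_stimulus(pressure, heat, charge) else "continue"
```

Whether flipping that boolean counts as "feeling" pain is of course the whole debate; the point is only that the detect-and-avoid behavior itself is trivial to build.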

If the latter, that's just ground-level circular reasoning: "Robots can't feel because you need to feel to feel, which robots cannot do."

1

Poncho_au t1_j9fff22 wrote

It’s fundamentally impossible for them to be the #1 carrier globally. That’s due to physics: a wireless communication medium can’t even be competitive with a fibre-optic one, so in high-population-density areas where fibre makes sense, they’ll have low customer counts.
In rural areas, though, they could absolutely become the #1 ISP.

1

TulpagenicUNISS t1_j9fetdc wrote

It really is not. While the two sides are not the same on the surface, both Republicans and Democrats participate in insider trading on the stock market, and both use capital to steer political decisions away from what's best for the people. The Republicans and the Democrats have been the same party for nearly a century; it's only recently that they have diverged enough to have actual disagreements on things. The belief that Democrats will save you from Republicans, or Republicans will save you from Democrats, is literally taking the steps to hand authoritarians the reins of government. Which, I mean, cool I guess, you get internet points for not being the other guys, when you shouldn't want to be any of them.

11

Peace-Bone t1_j9fer5r wrote

This article is making a LOT of assumptions, not many of which are well founded. First of all, it's largely assuming that society as we know it will still exist without huge changes and that genetic engineering will not be a major factor. Which is absurd, but a sane assumption to make for the sake of having a coherent article.

Beyond that, its assumptions are based too much on a pop-culture 'caveman vs. modern world' mentality. It assumes that 'anxiety and aggression' are selected against in a developed world, which isn't well founded, and it assumes that anxiety and aggression are distinct, observable traits.

Furthermore, it's conflating 'societal beauty standards', 'personal sexual preference', 'person people want to breed with', and 'person people are likely to breed with' as if they were all the same thing. Those are all separate groups. The first is a political thing more than anything, divorced from the rest, and the second is a far cry from the third and fourth.

And the assumption that gender differences will become more pronounced really isn't founded at all; we're in a trend of gender lines becoming more blurred. It also seems in direct opposition to the article's vague notion that people will become 'more standardized', which looks like the opposite of what's likely to happen.

5

TulpagenicUNISS t1_j9feeed wrote

America is already run by a collection of oligarchs; they buy up all the politicians, or they themselves run for political seats. The ideas of aristocracy, feudalism, dictatorship: it's all short-sighted. These are hyper-capitalist entities that really don't give two flying flips about political alignment. All they care about is profit and money. The sooner people realize that we aren't ever going to defeat them by having elections every 2 years, the sooner we might make some progress.

0

bremidon t1_j9feat9 wrote

>Would the most sentient ai ever actually experience emotion or does it just think it is?

When you find yourself asking a question like this, change the sentence around to reference people and see if you can give a clear answer. Like this:

Do people ever actually experience emotion or do they just think they do?

And you have now just wandered into some extremely deep waters. Even if you can convince yourself that *your* emotions are real, how do you know that anyone else actually *feels* emotions? Maybe you are the only one.

And once you have thought about this long enough, you are almost certainly going to realize: we will never know for sure.

And that leads to the next really troublesome question: what are we going to do about it? Should we give digital agents the benefit of the doubt?

And even though I always say "there's always one in every crowd," it never seems to help; it's like they can't help themselves. Still, here is my disclaimer: I do not think that any current digital agent is conscious, feels things, or anything of the sort. I am just not entirely certain what my reason for thinking that is.

And to the folks who watched a YouTube video about how transformers work and think that explains everything: it does not. We have a pretty good idea of how brain cells work in detail, but we have no idea how we get from chemicals and potentials to consciousness. Knowing how the building blocks work does not necessarily give you any insight into how the system works. Emergent behavior is a thing.

1

theironlion245 t1_j9fe3zf wrote

How could you harm ChatGPT? It doesn't feel pain, it doesn't get injured, it doesn't die, and it can replicate itself indefinitely.

There are zettabytes of storage around the world and a massive world wide web; if it had access to the internet, ChatGPT could hide itself tomorrow and it would be near impossible to find.

An AI advanced enough would be virtually impossible to kill. So no, it doesn't need emotions, and no, the entire human species as a whole wouldn't represent any danger to it.

2

Semifreak t1_j9fdf9h wrote

Heh.

We've seen movies where machines 'fear' being unplugged, but that's nonsense. You could tell an AI to erase itself and it would. Why would it have any preference about whether or not electricity is moving through its transistors? 'Death' and 'life' are meaningless to artificial machines.

1

midnitelux t1_j9fadqm wrote

A sentient robot would still benefit from detecting danger, and, if necessary, it would need to create bonds to survive. It may not need food, but unless it was programmed not to care about itself, it would definitely not want to die.

2

BetoBarnassian t1_j9fac28 wrote

We would need a good physical definition of what emotions are in a general sense. I think emotions are simply an impetus to behave in a certain way. How we act is some weird aggregated calculus of all the different things we want and don't want, with varying degrees of intensity. In this sense, emotions are more fundamental than the idea of being "happy", "sad", or "angry"; they are simply behavioural expressions used to get what we want. Why do people get frustrated? Usually because they have to deal with stuff they don't want to. What does frustration do? It motivates people to leave a situation or change it. When we enjoy things, we usually seek more of them. Yet life is complicated, and we have to balance many desires and wants against others, leading to situations where we do things we don't want in order to get things we do.

So long as you can program a way for an AI to have goals/wants/desires/priorities, then emotions (imo) are simply the attempt to achieve those goals, fulfil those desires, etc. Will they feel happiness or sadness in the same way we do? Probably not; they don't mimic human biology and are unlikely to be made to, so there will be differences in emotional expression. But I do think they will have analogous expressions that serve similar purposes.
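That "aggregated calculus" idea can be sketched concretely. Everything here is illustrative: the goal names, weights, and action effects are invented just to show the shape of the argument, not taken from any real system:

```python
# Toy agent: each candidate action is scored by how much it advances each
# weighted goal, and the agent picks the highest-scoring action. A strong
# "avoid frustration" weight then plays the motivating role the comment
# assigns to the emotion of frustration.

def choose_action(goals: dict, effects: dict) -> str:
    """goals: {goal: weight}; effects: {action: {goal: contribution}}."""
    def score(action: str) -> float:
        return sum(goals.get(g, 0.0) * v for g, v in effects[action].items())
    return max(effects, key=score)

goals = {"avoid_frustration": 2.0, "finish_task": 1.0}
effects = {
    "leave_situation": {"avoid_frustration": 1.0, "finish_task": -1.0},
    "push_through":    {"avoid_frustration": -0.5, "finish_task": 1.0},
}
# With frustration weighted heavily, leaving scores higher than pushing through.
```

Balancing competing desires, as the comment describes, is just what happens when several goals contribute to the same score with opposite signs.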

This is just my quick 2 cents. I'm sure there are decent arguments to be made against this point, but I think it's a reasonably valid premise.

1